Dec 05 05:52:59 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 05 05:52:59 crc restorecon[4585]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 05:52:59 crc restorecon[4585]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 05:52:59 crc restorecon[4585]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 
05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 05:53:00 crc 
restorecon[4585]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 
05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 
05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc 
restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 05:53:00 crc restorecon[4585]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 05:53:00 crc restorecon[4585]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 05 05:53:00 crc kubenswrapper[4865]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 05:53:00 crc kubenswrapper[4865]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 05 05:53:00 crc kubenswrapper[4865]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 05:53:00 crc kubenswrapper[4865]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 05 05:53:00 crc kubenswrapper[4865]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 05 05:53:00 crc kubenswrapper[4865]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.818425 4865 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821695 4865 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821716 4865 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821722 4865 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821731 4865 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821749 4865 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821757 4865 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821761 4865 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821766 4865 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821771 4865 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821775 4865 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821781 4865 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821785 4865 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821791 4865 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821795 4865 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821799 4865 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821803 4865 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821807 4865 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821812 4865 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
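The deprecation warnings above all point to the file passed via the kubelet's --config flag as the supported place for these settings. A minimal sketch of such a KubeletConfiguration follows, assuming the upstream kubelet.config.k8s.io/v1beta1 schema; the field names exist in that API, but every value here is an illustrative placeholder and is not taken from this node:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    # config-file equivalent of --container-runtime-endpoint (placeholder socket path)
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
    # config-file equivalent of --volume-plugin-dir (placeholder directory)
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    # config-file equivalent of --register-with-taints (placeholder taint)
    registerWithTaints:
    - key: node-role.kubernetes.io/master
      effect: NoSchedule
    # config-file equivalent of --system-reserved (placeholder reservations)
    systemReserved:
      cpu: 500m
      memory: 1Gi
    # --minimum-container-ttl-duration has no direct field; the warning above
    # suggests eviction thresholds instead (placeholder threshold)
    evictionHard:
      memory.available: 100Mi
    # featureGates is the config-file counterpart of --feature-gates; the
    # "unrecognized feature gate" entries that follow are the kubelet reporting
    # gate names it does not know (placeholder gate shown here)
    featureGates:
      ExampleGate: true

With a file like this in place, the kubelet would be started with --config pointing at it instead of the individual deprecated flags; --pod-infra-container-image is the exception, since the warning above notes the sandbox image is now obtained from the CRI rather than configured on the kubelet.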
Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821817 4865 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821838 4865 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821843 4865 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821848 4865 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821854 4865 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821860 4865 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821866 4865 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821871 4865 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821876 4865 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821880 4865 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821886 4865 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821892 4865 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821898 4865 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821902 4865 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821906 4865 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821909 4865 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821913 4865 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821916 4865 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821921 4865 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821924 4865 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821928 4865 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821932 4865 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821937 4865 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821943 4865 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821947 4865 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821951 4865 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821955 4865 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821959 4865 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821963 4865 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821967 4865 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821971 4865 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821975 4865 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821979 4865 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821985 4865 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821989 4865 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821993 4865 feature_gate.go:330] unrecognized feature gate: Example Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.821998 4865 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.822001 4865 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.822005 4865 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.822009 4865 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.822013 4865 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.822017 4865 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.822020 4865 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.822024 4865 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.822028 4865 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.822031 4865 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.822035 4865 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.822038 4865 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.822042 4865 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.822045 4865 feature_gate.go:330] unrecognized feature gate: 
VSphereDriverConfiguration Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.822049 4865 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.822054 4865 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.822057 4865 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822280 4865 flags.go:64] FLAG: --address="0.0.0.0" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822291 4865 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822299 4865 flags.go:64] FLAG: --anonymous-auth="true" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822306 4865 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822312 4865 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822318 4865 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822324 4865 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822330 4865 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822335 4865 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822339 4865 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822344 4865 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822348 4865 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822353 4865 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822357 4865 flags.go:64] FLAG: --cgroup-root="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822361 4865 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822366 4865 flags.go:64] FLAG: --client-ca-file="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822370 4865 flags.go:64] FLAG: --cloud-config="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822374 4865 flags.go:64] FLAG: --cloud-provider="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822378 4865 flags.go:64] FLAG: --cluster-dns="[]" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822383 4865 flags.go:64] FLAG: --cluster-domain="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822387 4865 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822391 4865 flags.go:64] FLAG: --config-dir="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822395 4865 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822400 4865 flags.go:64] FLAG: --container-log-max-files="5" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822406 4865 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822410 4865 flags.go:64] FLAG: 
--container-runtime-endpoint="/var/run/crio/crio.sock" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822414 4865 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822418 4865 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822423 4865 flags.go:64] FLAG: --contention-profiling="false" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822428 4865 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822432 4865 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822436 4865 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822440 4865 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822446 4865 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822450 4865 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822454 4865 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822458 4865 flags.go:64] FLAG: --enable-load-reader="false" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822462 4865 flags.go:64] FLAG: --enable-server="true" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822467 4865 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822472 4865 flags.go:64] FLAG: --event-burst="100" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822477 4865 flags.go:64] FLAG: --event-qps="50" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822482 4865 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822486 4865 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822490 4865 flags.go:64] FLAG: --eviction-hard="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822495 4865 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822501 4865 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822505 4865 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822509 4865 flags.go:64] FLAG: --eviction-soft="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822513 4865 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822517 4865 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822521 4865 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822525 4865 flags.go:64] FLAG: --experimental-mounter-path="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822529 4865 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822533 4865 flags.go:64] FLAG: --fail-swap-on="true" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822538 4865 flags.go:64] FLAG: --feature-gates="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822543 4865 
flags.go:64] FLAG: --file-check-frequency="20s" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822547 4865 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822551 4865 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822555 4865 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822560 4865 flags.go:64] FLAG: --healthz-port="10248" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822564 4865 flags.go:64] FLAG: --help="false" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822568 4865 flags.go:64] FLAG: --hostname-override="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822572 4865 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822576 4865 flags.go:64] FLAG: --http-check-frequency="20s" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822580 4865 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822584 4865 flags.go:64] FLAG: --image-credential-provider-config="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822588 4865 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822593 4865 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822597 4865 flags.go:64] FLAG: --image-service-endpoint="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822601 4865 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822605 4865 flags.go:64] FLAG: --kube-api-burst="100" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822609 4865 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822613 4865 flags.go:64] FLAG: --kube-api-qps="50" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822617 4865 flags.go:64] FLAG: --kube-reserved="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822621 4865 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822625 4865 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822630 4865 flags.go:64] FLAG: --kubelet-cgroups="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822634 4865 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822639 4865 flags.go:64] FLAG: --lock-file="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822643 4865 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822647 4865 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822651 4865 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822658 4865 flags.go:64] FLAG: --log-json-split-stream="false" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822663 4865 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822668 4865 flags.go:64] FLAG: --log-text-split-stream="false" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822673 4865 flags.go:64] FLAG: 
--logging-format="text" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822678 4865 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822684 4865 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822689 4865 flags.go:64] FLAG: --manifest-url="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822694 4865 flags.go:64] FLAG: --manifest-url-header="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822701 4865 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822707 4865 flags.go:64] FLAG: --max-open-files="1000000" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822713 4865 flags.go:64] FLAG: --max-pods="110" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822717 4865 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822723 4865 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822728 4865 flags.go:64] FLAG: --memory-manager-policy="None" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822734 4865 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822740 4865 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822745 4865 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822751 4865 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822761 4865 flags.go:64] FLAG: --node-status-max-images="50" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822765 4865 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822769 4865 flags.go:64] FLAG: --oom-score-adj="-999" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822774 4865 flags.go:64] FLAG: --pod-cidr="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822778 4865 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822785 4865 flags.go:64] FLAG: --pod-manifest-path="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822789 4865 flags.go:64] FLAG: --pod-max-pids="-1" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822793 4865 flags.go:64] FLAG: --pods-per-core="0" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822798 4865 flags.go:64] FLAG: --port="10250" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822802 4865 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822806 4865 flags.go:64] FLAG: --provider-id="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822810 4865 flags.go:64] FLAG: --qos-reserved="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822814 4865 flags.go:64] FLAG: --read-only-port="10255" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822836 4865 flags.go:64] FLAG: --register-node="true" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822842 4865 flags.go:64] FLAG: 
--register-schedulable="true" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822848 4865 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822857 4865 flags.go:64] FLAG: --registry-burst="10" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822862 4865 flags.go:64] FLAG: --registry-qps="5" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822867 4865 flags.go:64] FLAG: --reserved-cpus="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822873 4865 flags.go:64] FLAG: --reserved-memory="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822879 4865 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822885 4865 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822890 4865 flags.go:64] FLAG: --rotate-certificates="false" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822896 4865 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822901 4865 flags.go:64] FLAG: --runonce="false" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822906 4865 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822911 4865 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822916 4865 flags.go:64] FLAG: --seccomp-default="false" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822920 4865 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822929 4865 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822934 4865 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822938 4865 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822944 4865 flags.go:64] FLAG: --storage-driver-password="root" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822948 4865 flags.go:64] FLAG: --storage-driver-secure="false" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822959 4865 flags.go:64] FLAG: --storage-driver-table="stats" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822967 4865 flags.go:64] FLAG: --storage-driver-user="root" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822972 4865 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822977 4865 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822983 4865 flags.go:64] FLAG: --system-cgroups="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822987 4865 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.822996 4865 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.823001 4865 flags.go:64] FLAG: --tls-cert-file="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.823006 4865 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.823012 4865 flags.go:64] FLAG: --tls-min-version="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.823017 4865 flags.go:64] 
FLAG: --tls-private-key-file="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.823021 4865 flags.go:64] FLAG: --topology-manager-policy="none" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.823026 4865 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.823030 4865 flags.go:64] FLAG: --topology-manager-scope="container" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.823035 4865 flags.go:64] FLAG: --v="2" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.823044 4865 flags.go:64] FLAG: --version="false" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.823051 4865 flags.go:64] FLAG: --vmodule="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.823057 4865 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.823062 4865 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823186 4865 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823192 4865 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823197 4865 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823201 4865 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823206 4865 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823210 4865 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823214 4865 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823218 4865 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823223 4865 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823226 4865 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823230 4865 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823233 4865 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823237 4865 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823241 4865 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823244 4865 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823248 4865 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823251 4865 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823255 4865 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823258 4865 feature_gate.go:330] unrecognized feature gate: 
ImageStreamImportMode Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823262 4865 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823265 4865 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823270 4865 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823275 4865 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823279 4865 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823282 4865 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823287 4865 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823291 4865 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823295 4865 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823299 4865 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823303 4865 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823307 4865 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823311 4865 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823320 4865 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823324 4865 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823328 4865 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823331 4865 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823335 4865 feature_gate.go:330] unrecognized feature gate: Example Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823338 4865 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823342 4865 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823346 4865 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823351 4865 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823354 4865 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823358 4865 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823361 4865 feature_gate.go:330] unrecognized feature gate: 
NetworkDiagnosticsConfig Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823364 4865 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823368 4865 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823371 4865 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823375 4865 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823378 4865 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823382 4865 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823385 4865 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823389 4865 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823392 4865 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823396 4865 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823399 4865 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823402 4865 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823406 4865 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823409 4865 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823413 4865 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823416 4865 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823420 4865 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823423 4865 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823427 4865 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823431 4865 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823436 4865 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823440 4865 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823443 4865 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823447 4865 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823456 4865 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. 
It will be removed in a future release. Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823463 4865 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.823467 4865 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.823591 4865 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.832710 4865 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.832742 4865 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833121 4865 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833151 4865 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833160 4865 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833170 4865 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833178 4865 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833186 4865 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833193 4865 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833201 4865 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833208 4865 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833216 4865 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833225 4865 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833232 4865 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833239 4865 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833245 4865 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833252 4865 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833259 4865 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833266 4865 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833273 4865 feature_gate.go:330] unrecognized feature gate: 
MultiArchInstallGCP Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833279 4865 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833285 4865 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833293 4865 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833300 4865 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833309 4865 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833319 4865 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833327 4865 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833335 4865 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833342 4865 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833350 4865 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833359 4865 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833367 4865 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833374 4865 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833382 4865 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833390 4865 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833397 4865 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833413 4865 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833420 4865 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833427 4865 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833435 4865 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833441 4865 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833448 4865 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833457 4865 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
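Most of this excerpt is the same set of feature_gate.go:330 "unrecognized feature gate" warnings printed again on every pass over the feature-gate flags; the names look like cluster-level (OpenShift-side) gates that the kubelet's own gate registry does not know, so it warns and keeps starting, while the feature_gate.go:351/353 lines are known upstream gates being force-set even though they are deprecated or already GA. A small triage sketch (hypothetical helper, standard library only) that collapses a saved excerpt like this one into one counted line per gate:

```python
import re
import sys
from collections import Counter

# Collapse the repeated kubelet feature-gate messages in a journal excerpt
# (fed on stdin) into a unique, counted list per gate name.
UNRECOGNIZED = re.compile(r"unrecognized feature gate: (\w+)")
FORCED = re.compile(r"Setting (GA|deprecated) feature gate (\w+)=(\w+)")

unknown = Counter()
forced = {}
for line in sys.stdin:
    for name in UNRECOGNIZED.findall(line):
        unknown[name] += 1
    for kind, name, value in FORCED.findall(line):
        forced[name] = (kind, value)

for name, count in sorted(unknown.items()):
    print(f"unrecognized x{count}: {name}")
for name, (kind, value) in sorted(forced.items()):
    print(f"forced ({kind}): {name}={value}")
```

Run against a captured excerpt, for example `journalctl -u kubelet --no-pager | python3 gate_triage.py` (unit name and script name assumed); on this excerpt it reduces the hundreds of repeated warning lines to one line per unique gate name, plus the four force-set upstream gates (CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders, ValidatingAdmissionPolicy, KMSv1).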
Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833466 4865 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833474 4865 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833481 4865 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833490 4865 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833497 4865 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833504 4865 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833512 4865 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833520 4865 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833527 4865 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833534 4865 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833541 4865 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833547 4865 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833554 4865 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833561 4865 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833568 4865 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833575 4865 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833582 4865 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833589 4865 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833596 4865 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833603 4865 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833611 4865 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833617 4865 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833625 4865 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833632 4865 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833638 4865 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 
05:53:00.833645 4865 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833651 4865 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833656 4865 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833661 4865 feature_gate.go:330] unrecognized feature gate: Example Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833667 4865 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.833678 4865 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833877 4865 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833886 4865 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833893 4865 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833898 4865 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833904 4865 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833909 4865 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833915 4865 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833920 4865 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833928 4865 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
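The line that matters after all the warnings is the feature_gate.go:386 summary, which prints the gate set the kubelet actually runs with as a Go map; the three occurrences of it in this excerpt are identical. A sketch (regex keyed to exactly the format shown here) that turns that map dump into a Python dict:

```python
import re

# The effective gate set, copied verbatim from the feature_gate.go:386 entries
# in this excerpt (all three occurrences print the same map).
summary = ("feature gates: {map[CloudDualStackNodeIPs:true "
           "DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false "
           "EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false "
           "ProcMountType:false RouteExternalCertificate:false "
           "ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false "
           "UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false "
           "ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}")

body = re.search(r"\{map\[(.*)\]\}", summary).group(1)
gates = {name: value == "true"
         for name, value in (tok.split(":") for tok in body.split())}

enabled = sorted(n for n, on in gates.items() if on)
# CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders, KMSv1,
# ValidatingAdmissionPolicy
print("enabled:", ", ".join(enabled))
print("total gates pinned:", len(gates))  # 15
```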
Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833935 4865 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833941 4865 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833948 4865 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833953 4865 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833959 4865 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833964 4865 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833970 4865 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833976 4865 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833981 4865 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833986 4865 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833991 4865 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.833997 4865 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834002 4865 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834010 4865 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834017 4865 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834024 4865 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834031 4865 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834037 4865 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834043 4865 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834049 4865 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834054 4865 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834059 4865 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834065 4865 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834071 4865 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834077 4865 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834082 4865 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834087 4865 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834092 4865 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834098 4865 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834103 4865 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834108 4865 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834113 4865 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834118 4865 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834123 4865 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834128 4865 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834133 4865 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834138 4865 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834144 4865 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834150 4865 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834157 4865 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834163 4865 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834169 4865 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834174 4865 feature_gate.go:330] unrecognized feature gate: Example Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834180 4865 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834186 4865 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834191 4865 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834197 4865 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834202 4865 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834208 4865 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834213 4865 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834218 4865 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834224 4865 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834229 4865 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834234 4865 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834239 4865 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834244 4865 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834251 4865 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834257 4865 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834263 4865 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834268 4865 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834273 4865 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.834278 4865 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.834287 4865 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.834462 4865 server.go:940] "Client rotation is on, will bootstrap in background" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.837640 4865 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.837748 4865 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.838384 4865 server.go:997] "Starting client certificate rotation" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.838418 4865 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.838545 4865 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-13 15:32:37.06430072 +0000 UTC Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.838597 4865 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.843238 4865 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 05:53:00 crc kubenswrapper[4865]: E1205 05:53:00.844837 4865 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.147:6443: connect: connection refused" logger="UnhandledError" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.845185 4865 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.850803 4865 log.go:25] "Validated CRI v1 runtime API" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.862026 4865 log.go:25] "Validated CRI v1 image API" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.862980 4865 
server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.864679 4865 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-05-05-47-38-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.864722 4865 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.877380 4865 manager.go:217] Machine: {Timestamp:2025-12-05 05:53:00.876756838 +0000 UTC m=+0.156768070 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2800000 MemoryCapacity:25199476736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:2604d388-53ca-45fa-bc89-ff5d1e3eaa2e BootID:7a4ce43a-48c4-42f5-916f-3bf20b87a069 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599738368 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:32:b4:93 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:32:b4:93 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:d3:2d:43 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:16:b4:a6 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:5c:e9:e8 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ee:72:0d Speed:-1 Mtu:1496} {Name:eth10 MacAddress:e6:06:87:23:4c:c7 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:2a:c9:00:94:4f:42 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199476736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 
Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.877577 4865 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.877711 4865 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.878395 4865 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.878568 4865 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.878597 4865 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.878897 4865 topology_manager.go:138] "Creating topology manager with none policy" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.878907 4865 container_manager_linux.go:303] "Creating device plugin manager" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.879120 4865 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.879160 4865 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.879366 4865 state_mem.go:36] "Initialized new in-memory state store" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.879453 4865 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.881107 4865 kubelet.go:418] "Attempting to sync node with API server" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.881136 4865 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.881172 4865 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.881191 4865 kubelet.go:324] "Adding apiserver pod source" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.881206 4865 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.883015 4865 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Dec 05 05:53:00 crc kubenswrapper[4865]: E1205 05:53:00.883093 
4865 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.147:6443: connect: connection refused" logger="UnhandledError" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.883317 4865 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.883914 4865 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Dec 05 05:53:00 crc kubenswrapper[4865]: E1205 05:53:00.883965 4865 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.147:6443: connect: connection refused" logger="UnhandledError" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.883994 4865 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.884953 4865 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.885750 4865 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.885776 4865 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.885785 4865 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.885795 4865 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.885810 4865 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.885820 4865 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.885853 4865 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.885879 4865 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.885892 4865 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.885902 4865 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.885916 4865 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.885926 4865 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.886121 4865 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.886683 
4865 server.go:1280] "Started kubelet" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.886859 4865 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.887244 4865 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.887255 4865 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.888038 4865 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.888421 4865 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.888507 4865 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.889158 4865 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.889177 4865 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.889218 4865 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.889150 4865 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 09:24:53.194124591 +0000 UTC Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.889271 4865 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 915h31m52.304860231s for next certificate rotation Dec 05 05:53:00 crc kubenswrapper[4865]: W1205 05:53:00.889759 4865 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Dec 05 05:53:00 crc kubenswrapper[4865]: E1205 05:53:00.889817 4865 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.147:6443: connect: connection refused" logger="UnhandledError" Dec 05 05:53:00 crc kubenswrapper[4865]: E1205 05:53:00.889920 4865 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 05 05:53:00 crc systemd[1]: Started Kubernetes Kubelet. 
Dec 05 05:53:00 crc kubenswrapper[4865]: E1205 05:53:00.889098 4865 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.147:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e3bea29de73a3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 05:53:00.886643619 +0000 UTC m=+0.166654851,LastTimestamp:2025-12-05 05:53:00.886643619 +0000 UTC m=+0.166654851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.890284 4865 factory.go:55] Registering systemd factory Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.890298 4865 factory.go:221] Registration of the systemd container factory successfully Dec 05 05:53:00 crc kubenswrapper[4865]: E1205 05:53:00.890564 4865 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" interval="200ms" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.891458 4865 server.go:460] "Adding debug handlers to kubelet server" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.893008 4865 factory.go:153] Registering CRI-O factory Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.893041 4865 factory.go:221] Registration of the crio container factory successfully Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.893134 4865 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.893163 4865 factory.go:103] Registering Raw factory Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.893184 4865 manager.go:1196] Started watching for new ooms in manager Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.894697 4865 manager.go:319] Starting recovery of all containers Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.904343 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.904724 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.904773 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.904934 4865 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.904956 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.904970 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.904989 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.905002 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.905044 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.905989 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906017 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906035 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906049 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906066 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906077 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906089 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906101 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906113 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906123 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906134 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906146 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906157 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906167 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906179 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906189 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906199 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906213 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906225 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906236 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906248 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906261 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906271 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906287 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906297 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906322 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906339 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906352 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906363 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906374 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906383 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906392 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906403 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906413 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906423 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906433 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906443 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906452 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906467 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906477 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906487 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906498 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906508 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906524 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906535 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906545 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906556 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906566 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906578 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906609 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.906620 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907237 4865 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907257 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907268 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907281 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907291 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907301 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907313 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907326 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907338 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907352 4865 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907362 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907373 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907383 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907393 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907402 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907410 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907420 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907429 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907439 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907452 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907461 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907471 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907480 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907489 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907500 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907510 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907519 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907528 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907538 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907549 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907561 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907570 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907579 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907590 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907599 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907610 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907618 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907628 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907637 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907645 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907659 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907668 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907678 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907694 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907704 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907718 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907729 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907738 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907749 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907760 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907770 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907782 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907792 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907802 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" 
seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907812 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907836 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907845 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907853 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907862 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907872 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907880 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907889 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907897 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907906 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907916 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 05 05:53:00 crc 
kubenswrapper[4865]: I1205 05:53:00.907925 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907934 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907947 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907957 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907966 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907978 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907987 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.907996 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908005 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908015 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908024 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" 
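The long run of reconstruct.go:130 records above and below is the kubelet, freshly restarted, rebuilding its actual state of world from the volume directories left on disk under /var/lib/kubelet/pods before it can reach the API server; each record names the pod UID and the volume plugin path, and the volume is added as "uncertain" until the reconciler can confirm it later ("Volume reconstruction finished" / "Reconciler: start to sync state" further down). A quick way to get an overview of such a flood is to tally the records per volume plugin and per pod UID. The sketch below assumes the journal text has been saved verbatim to a file; the file name kubelet.log and the regular expression are illustrative, not taken from the log itself.

```python
#!/usr/bin/env python3
"""Tally the kubelet's volume-reconstruction records by plugin and by pod UID.

A minimal sketch: it assumes the journal text has been saved verbatim to a file,
and "kubelet.log" is a placeholder name rather than anything taken from the log.
"""
import re
import sys
from collections import Counter

# Matches the key="value" fields emitted by reconstruct.go:130, e.g.
#   podName="6509e943-..." volumeName="kubernetes.io/configmap/<uid>-config"
RECORD = re.compile(
    r'Volume is marked as uncertain and added into the actual state"'
    r'\s+pod="[^"]*"\s+podName="(?P<pod>[^"]+)"\s+volumeName="(?P<vol>[^"]+)"'
)

def main(path: str) -> None:
    raw = open(path, encoding="utf-8", errors="replace").read()
    text = " ".join(raw.split())          # undo any line wrapping inside records
    per_pod, per_plugin = Counter(), Counter()
    for m in RECORD.finditer(text):
        per_pod[m.group("pod")] += 1
        # the first two path segments name the plugin, e.g. "kubernetes.io/secret"
        per_plugin["/".join(m.group("vol").split("/")[:2])] += 1
    print(f"{sum(per_pod.values())} uncertain volumes across {len(per_pod)} pods")
    for plugin, n in per_plugin.most_common():
        print(f"{n:4d}  {plugin}")

if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else "kubelet.log")
```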
Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908033 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908042 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908051 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908061 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908070 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908079 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908089 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908101 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908111 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908120 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908129 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 
05:53:00.908137 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908145 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908155 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908165 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908174 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908183 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908192 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908202 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908240 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908254 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908266 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908275 4865 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908285 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908294 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908304 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908314 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908326 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908339 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908351 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908363 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908374 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908387 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908400 4865 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908410 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908421 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908430 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908439 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908449 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908458 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908468 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908477 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908485 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908495 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908506 4865 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908515 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908525 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908534 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908543 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908552 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908561 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908570 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908580 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908592 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908603 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908615 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908629 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908638 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908647 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908660 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908670 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908679 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908690 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908699 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908709 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908720 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908731 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908742 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908752 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908763 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908772 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908782 4865 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908790 4865 reconstruct.go:97] "Volume reconstruction finished" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.908798 4865 reconciler.go:26] "Reconciler: start to sync state" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.915254 4865 manager.go:324] Recovery completed Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.926359 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.928409 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.928441 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.928453 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.929143 4865 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.929163 4865 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 05 05:53:00 crc kubenswrapper[4865]: I1205 05:53:00.929217 4865 state_mem.go:36] "Initialized new in-memory state store" Dec 05 05:53:00 crc kubenswrapper[4865]: E1205 05:53:00.990147 4865 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.001871 4865 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.003942 4865 policy_none.go:49] "None policy: Start" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.004182 4865 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.005150 4865 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.005219 4865 kubelet.go:2335] "Starting kubelet main sync loop" Dec 05 05:53:01 crc kubenswrapper[4865]: E1205 05:53:01.005272 4865 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 05 05:53:01 crc kubenswrapper[4865]: W1205 05:53:01.006009 4865 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Dec 05 05:53:01 crc kubenswrapper[4865]: E1205 05:53:01.006063 4865 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.147:6443: connect: connection refused" logger="UnhandledError" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.006568 4865 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.006594 4865 state_mem.go:35] "Initializing new in-memory state store" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.067715 4865 manager.go:334] "Starting Device Plugin manager" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.067822 4865 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.067847 4865 server.go:79] "Starting device plugin registration server" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.068189 4865 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.068201 4865 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.068448 4865 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.068559 4865 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.068569 4865 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 05 05:53:01 crc kubenswrapper[4865]: E1205 05:53:01.076748 4865 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 05 05:53:01 crc kubenswrapper[4865]: E1205 05:53:01.091449 4865 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" interval="400ms" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.105797 4865 kubelet.go:2421] "SyncLoop ADD" 
source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.105904 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.106927 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.106956 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.106964 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.107135 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.107425 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.107464 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.108204 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.108228 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.108251 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.108261 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.108233 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.108312 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.108394 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.108564 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.108587 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.109925 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.109945 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.109954 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.110059 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.110077 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.110092 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.110179 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.110334 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.110392 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.110852 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.110887 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.110898 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.111051 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.111200 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.111259 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.111538 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.111596 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.111611 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.111729 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.111762 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.111771 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.111946 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.111968 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.112019 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.112047 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.112060 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.113624 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.113649 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.113659 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.168621 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.169510 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.169554 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.169568 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.169594 4865 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 05:53:01 crc kubenswrapper[4865]: E1205 05:53:01.170007 4865 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.147:6443: connect: connection refused" node="crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.211338 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.211392 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.211411 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.211425 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.211441 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.211461 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.211476 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.211490 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.211510 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.211525 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.211539 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.211551 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.211564 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.211577 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.211591 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.313441 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.313502 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.313528 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.313549 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.313573 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.313592 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.313611 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.313631 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.313651 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.313671 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.313692 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.313711 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.313741 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.313762 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.313785 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.314235 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.314309 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.314343 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.314364 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.314386 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.314408 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.314431 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.314451 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.314471 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.314510 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.314530 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.314551 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.314571 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.314597 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.314625 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.370731 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.371964 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.372011 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.372028 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.372056 4865 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 
05:53:01 crc kubenswrapper[4865]: E1205 05:53:01.372634 4865 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.147:6443: connect: connection refused" node="crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.449394 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.456199 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: W1205 05:53:01.470000 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-b30b293cf3d7e1409ce59e05277fd3d8e0924f00505f74ff1a5cbfba640a2914 WatchSource:0}: Error finding container b30b293cf3d7e1409ce59e05277fd3d8e0924f00505f74ff1a5cbfba640a2914: Status 404 returned error can't find the container with id b30b293cf3d7e1409ce59e05277fd3d8e0924f00505f74ff1a5cbfba640a2914 Dec 05 05:53:01 crc kubenswrapper[4865]: W1205 05:53:01.473553 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-5db69abf145921aa1fb739e5f136987c868dd3def59cc2d9381458637c356c54 WatchSource:0}: Error finding container 5db69abf145921aa1fb739e5f136987c868dd3def59cc2d9381458637c356c54: Status 404 returned error can't find the container with id 5db69abf145921aa1fb739e5f136987c868dd3def59cc2d9381458637c356c54 Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.477198 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.492375 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: E1205 05:53:01.492789 4865 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" interval="800ms" Dec 05 05:53:01 crc kubenswrapper[4865]: W1205 05:53:01.496068 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-899492a9f71de54861a0f278f663c63596e1d37a9cacd85417d7118c55e54cdb WatchSource:0}: Error finding container 899492a9f71de54861a0f278f663c63596e1d37a9cacd85417d7118c55e54cdb: Status 404 returned error can't find the container with id 899492a9f71de54861a0f278f663c63596e1d37a9cacd85417d7118c55e54cdb Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.500533 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 05 05:53:01 crc kubenswrapper[4865]: W1205 05:53:01.501471 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-9ba9c33fef4e9ee13c7501c1bd8227256ad6a763e9f514e29f0fd1560c158f3b WatchSource:0}: Error finding container 9ba9c33fef4e9ee13c7501c1bd8227256ad6a763e9f514e29f0fd1560c158f3b: Status 404 returned error can't find the container with id 9ba9c33fef4e9ee13c7501c1bd8227256ad6a763e9f514e29f0fd1560c158f3b Dec 05 05:53:01 crc kubenswrapper[4865]: W1205 05:53:01.516008 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-f72016304983c80db7d0dee89129d793760b0f7e9805711c59560e7fcf395f0a WatchSource:0}: Error finding container f72016304983c80db7d0dee89129d793760b0f7e9805711c59560e7fcf395f0a: Status 404 returned error can't find the container with id f72016304983c80db7d0dee89129d793760b0f7e9805711c59560e7fcf395f0a Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.773552 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.775187 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.775220 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.775233 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.775258 4865 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 05:53:01 crc kubenswrapper[4865]: E1205 05:53:01.775965 4865 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.147:6443: connect: connection refused" node="crc" Dec 05 05:53:01 crc kubenswrapper[4865]: I1205 05:53:01.887658 4865 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Dec 05 05:53:01 crc kubenswrapper[4865]: W1205 05:53:01.916450 4865 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Dec 05 05:53:01 crc kubenswrapper[4865]: E1205 05:53:01.916522 4865 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.147:6443: connect: connection refused" logger="UnhandledError" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.009802 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c4decfc2a1886b97bc70deec1e0391c2ec3f1963ccf9bd83d9893d6cf1459a5d"} Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.009957 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5db69abf145921aa1fb739e5f136987c868dd3def59cc2d9381458637c356c54"} Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.011172 4865 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7b41e449fb8368c47471e8c81ee727a6c8346cd627097d6e50d17234505b1298" exitCode=0 Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.011203 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7b41e449fb8368c47471e8c81ee727a6c8346cd627097d6e50d17234505b1298"} Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.011230 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b30b293cf3d7e1409ce59e05277fd3d8e0924f00505f74ff1a5cbfba640a2914"} Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.011347 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.014853 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.014885 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.014894 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.015817 4865 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="dcf7b278d473622bb29b906401c7c51ec01498e398f9993a54fe8b4014a6f3c2" exitCode=0 Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.015862 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"dcf7b278d473622bb29b906401c7c51ec01498e398f9993a54fe8b4014a6f3c2"} Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.015902 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f72016304983c80db7d0dee89129d793760b0f7e9805711c59560e7fcf395f0a"} Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.016019 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.016638 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.017227 4865 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="59ec2c5c689c82d7cfa38c82e8a62bb37c9b042b85cce77d2d3b2a437cc4427d" exitCode=0 Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.017284 4865 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"59ec2c5c689c82d7cfa38c82e8a62bb37c9b042b85cce77d2d3b2a437cc4427d"} Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.017303 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.017322 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.017332 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.017305 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9ba9c33fef4e9ee13c7501c1bd8227256ad6a763e9f514e29f0fd1560c158f3b"} Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.017362 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.017458 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.017480 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.017489 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.018152 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.018172 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.018179 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.020338 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"67aa4e80cd6452b2b1725f5bfa8eaad24eb07125ee6245b6df0ce75aa96b0635"} Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.020461 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.020876 4865 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="67aa4e80cd6452b2b1725f5bfa8eaad24eb07125ee6245b6df0ce75aa96b0635" exitCode=0 Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.020915 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"899492a9f71de54861a0f278f663c63596e1d37a9cacd85417d7118c55e54cdb"} Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.021241 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:02 crc 
kubenswrapper[4865]: I1205 05:53:02.021270 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.021283 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:02 crc kubenswrapper[4865]: W1205 05:53:02.117730 4865 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Dec 05 05:53:02 crc kubenswrapper[4865]: E1205 05:53:02.117814 4865 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.147:6443: connect: connection refused" logger="UnhandledError" Dec 05 05:53:02 crc kubenswrapper[4865]: W1205 05:53:02.230698 4865 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Dec 05 05:53:02 crc kubenswrapper[4865]: E1205 05:53:02.230773 4865 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.147:6443: connect: connection refused" logger="UnhandledError" Dec 05 05:53:02 crc kubenswrapper[4865]: W1205 05:53:02.241066 4865 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.147:6443: connect: connection refused Dec 05 05:53:02 crc kubenswrapper[4865]: E1205 05:53:02.241128 4865 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.147:6443: connect: connection refused" logger="UnhandledError" Dec 05 05:53:02 crc kubenswrapper[4865]: E1205 05:53:02.294411 4865 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" interval="1.6s" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.577290 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.578505 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.578531 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.578540 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 
05:53:02 crc kubenswrapper[4865]: I1205 05:53:02.578558 4865 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 05:53:02 crc kubenswrapper[4865]: E1205 05:53:02.579137 4865 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.147:6443: connect: connection refused" node="crc" Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.027371 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4a3c15dd0c6f3d6189c6e3bceb33f79056dd799903bb7a78ea9e4211b22d4213"} Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.027414 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"23193eb0dedf5573f2128b2412610a50a30ca11c4cb66a5a1a20789fe32df679"} Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.027429 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c6cdfcf8dfd30d794cfda84b9e55cb274534a9aa08630df5d639835233bda497"} Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.027514 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.028642 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.028666 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.028675 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.030996 4865 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.031087 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c3c110665bc4baa2c7de39d20e7146d08e4adde3583630662a67eb60a3fcc7bc"} Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.031122 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"873f148d20c53be8eb8eaab2ceb12db015b6020bb2014f288d1eec4f858f30ca"} Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.031133 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ef72ed83c6c469013e019cc8cfd2fef94b5a732b139335f4872676689764a14d"} Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.031155 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.032366 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.032390 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.032399 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.035015 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0ff59482544992667891e11e736e2ff3c3c093fdc68505dbddca74bd065bbc75"} Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.035051 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"81b5ebabadb97cd677f48cadea47e96aeddeb43f21218d18a8fc35d2f80694cb"} Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.035063 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"67dcec4a9a0ff2f2778e848f4d11a689497c63530d2a950e74ef11f7405aeff1"} Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.035084 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"644d5d1cdf80e1e990ac2681ed71aa80a5197b2886d66f5bc92ce197c3c57077"} Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.038318 4865 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="88e2b78d34d1ee646bdc03cd891d8df3518b9958453d3d7d47be18a307ba3608" exitCode=0 Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.038375 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"88e2b78d34d1ee646bdc03cd891d8df3518b9958453d3d7d47be18a307ba3608"} Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.038466 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.039255 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.039283 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.039292 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.043591 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f1a07621486b2380da8bb4b0f8e67f434ae7ba6b7efb89386f48e3dab18fd40b"} Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.043648 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.044656 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:03 crc 
kubenswrapper[4865]: I1205 05:53:03.044684 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.044695 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:03 crc kubenswrapper[4865]: I1205 05:53:03.873997 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.047650 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6cd4f8940401d4b905b9766810dea6fd3d7543e9d0b204d61a8de150415d2644"} Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.047793 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.048600 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.048622 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.048630 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.050254 4865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.050284 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.050589 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4f745bd4aafdbfc4a1ea668d0b5051f7458d24beaf9c1196e962654d2ffc7e07"} Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.050649 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.050969 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.051542 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.051561 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.051571 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.052028 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.052046 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.052055 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.052433 4865 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.052453 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.052463 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.179407 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.180804 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.180853 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.180862 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.180881 4865 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 05:53:04 crc kubenswrapper[4865]: I1205 05:53:04.374106 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:53:05 crc kubenswrapper[4865]: I1205 05:53:05.054269 4865 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4f745bd4aafdbfc4a1ea668d0b5051f7458d24beaf9c1196e962654d2ffc7e07" exitCode=0 Dec 05 05:53:05 crc kubenswrapper[4865]: I1205 05:53:05.054363 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:05 crc kubenswrapper[4865]: I1205 05:53:05.054814 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:05 crc kubenswrapper[4865]: I1205 05:53:05.054404 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4f745bd4aafdbfc4a1ea668d0b5051f7458d24beaf9c1196e962654d2ffc7e07"} Dec 05 05:53:05 crc kubenswrapper[4865]: I1205 05:53:05.054859 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:05 crc kubenswrapper[4865]: I1205 05:53:05.055649 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:05 crc kubenswrapper[4865]: I1205 05:53:05.055763 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:05 crc kubenswrapper[4865]: I1205 05:53:05.055668 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:05 crc kubenswrapper[4865]: I1205 05:53:05.055799 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:05 crc kubenswrapper[4865]: I1205 05:53:05.055811 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:05 crc kubenswrapper[4865]: I1205 05:53:05.055776 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:05 crc 
kubenswrapper[4865]: I1205 05:53:05.056805 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:05 crc kubenswrapper[4865]: I1205 05:53:05.056862 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:05 crc kubenswrapper[4865]: I1205 05:53:05.056876 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:05 crc kubenswrapper[4865]: I1205 05:53:05.878070 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:53:05 crc kubenswrapper[4865]: I1205 05:53:05.883962 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:53:06 crc kubenswrapper[4865]: I1205 05:53:06.059807 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2ae4914f32895cdf931bdf529015e8fc109263e01c655fc0e8cb8734f7dae8f3"} Dec 05 05:53:06 crc kubenswrapper[4865]: I1205 05:53:06.059864 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8909267bb8f9e4ee5ea4f2ea5fc61b2c50e84c77fd69e6addcc8cda470a678c4"} Dec 05 05:53:06 crc kubenswrapper[4865]: I1205 05:53:06.059885 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:06 crc kubenswrapper[4865]: I1205 05:53:06.059930 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:06 crc kubenswrapper[4865]: I1205 05:53:06.060673 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:06 crc kubenswrapper[4865]: I1205 05:53:06.060705 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:06 crc kubenswrapper[4865]: I1205 05:53:06.060714 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:06 crc kubenswrapper[4865]: I1205 05:53:06.061200 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:06 crc kubenswrapper[4865]: I1205 05:53:06.061237 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:06 crc kubenswrapper[4865]: I1205 05:53:06.061248 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.002403 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.002604 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.003941 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.003973 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.003981 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.066107 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b086e88db9f5fa3cfde7989f3bd7a61e4de857a1e90be9d6096edd6ff971bb45"} Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.066162 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"68307dabb18e380ac53794376e441659598f33f40f5370689c41817bb2938301"} Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.066179 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f668a4557dedca1d640c1afd88cbeaa87caa90e20dbc58047a9cafdae30e1621"} Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.066183 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.066131 4865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.066327 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.067184 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.067207 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.067216 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.067233 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.067258 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.067267 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.278023 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.278232 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.279434 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.279490 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.279506 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.685269 4865 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 05 05:53:07 crc kubenswrapper[4865]: I1205 05:53:07.918074 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 05 05:53:08 crc kubenswrapper[4865]: I1205 05:53:08.068465 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:08 crc kubenswrapper[4865]: I1205 05:53:08.069405 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:08 crc kubenswrapper[4865]: I1205 05:53:08.069437 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:08 crc kubenswrapper[4865]: I1205 05:53:08.069459 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:08 crc kubenswrapper[4865]: I1205 05:53:08.802312 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:53:08 crc kubenswrapper[4865]: I1205 05:53:08.802469 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:08 crc kubenswrapper[4865]: I1205 05:53:08.803501 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:08 crc kubenswrapper[4865]: I1205 05:53:08.803548 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:08 crc kubenswrapper[4865]: I1205 05:53:08.803560 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:09 crc kubenswrapper[4865]: I1205 05:53:09.071001 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:09 crc kubenswrapper[4865]: I1205 05:53:09.072526 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:09 crc kubenswrapper[4865]: I1205 05:53:09.072556 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:09 crc kubenswrapper[4865]: I1205 05:53:09.072571 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:11 crc kubenswrapper[4865]: E1205 05:53:11.076898 4865 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 05 05:53:11 crc kubenswrapper[4865]: I1205 05:53:11.224663 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:53:11 crc kubenswrapper[4865]: I1205 05:53:11.224813 4865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 05:53:11 crc kubenswrapper[4865]: I1205 05:53:11.224881 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:11 crc kubenswrapper[4865]: I1205 05:53:11.226142 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:11 crc kubenswrapper[4865]: I1205 05:53:11.226209 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:11 crc 
kubenswrapper[4865]: I1205 05:53:11.226223 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:11 crc kubenswrapper[4865]: I1205 05:53:11.229059 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:53:11 crc kubenswrapper[4865]: I1205 05:53:11.751513 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:53:12 crc kubenswrapper[4865]: I1205 05:53:12.077260 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:12 crc kubenswrapper[4865]: I1205 05:53:12.078189 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:12 crc kubenswrapper[4865]: I1205 05:53:12.078223 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:12 crc kubenswrapper[4865]: I1205 05:53:12.078234 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:12 crc kubenswrapper[4865]: I1205 05:53:12.888496 4865 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 05 05:53:13 crc kubenswrapper[4865]: E1205 05:53:13.032497 4865 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 05 05:53:13 crc kubenswrapper[4865]: I1205 05:53:13.079793 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:13 crc kubenswrapper[4865]: I1205 05:53:13.080773 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:13 crc kubenswrapper[4865]: I1205 05:53:13.080814 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:13 crc kubenswrapper[4865]: I1205 05:53:13.080850 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:13 crc kubenswrapper[4865]: W1205 05:53:13.639321 4865 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 05 05:53:13 crc kubenswrapper[4865]: I1205 05:53:13.639407 4865 trace.go:236] Trace[1864754712]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 05:53:03.637) (total time: 10001ms): Dec 05 05:53:13 crc kubenswrapper[4865]: Trace[1864754712]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (05:53:13.639) Dec 05 05:53:13 crc kubenswrapper[4865]: Trace[1864754712]: [10.001448817s] [10.001448817s] END Dec 05 05:53:13 crc kubenswrapper[4865]: E1205 05:53:13.639428 
4865 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 05 05:53:13 crc kubenswrapper[4865]: W1205 05:53:13.818627 4865 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 05 05:53:13 crc kubenswrapper[4865]: I1205 05:53:13.818716 4865 trace.go:236] Trace[2110050171]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 05:53:03.816) (total time: 10001ms): Dec 05 05:53:13 crc kubenswrapper[4865]: Trace[2110050171]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (05:53:13.818) Dec 05 05:53:13 crc kubenswrapper[4865]: Trace[2110050171]: [10.001749055s] [10.001749055s] END Dec 05 05:53:13 crc kubenswrapper[4865]: E1205 05:53:13.818741 4865 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 05 05:53:13 crc kubenswrapper[4865]: E1205 05:53:13.895722 4865 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 05 05:53:14 crc kubenswrapper[4865]: E1205 05:53:14.182071 4865 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 05 05:53:14 crc kubenswrapper[4865]: I1205 05:53:14.751570 4865 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 05:53:14 crc kubenswrapper[4865]: I1205 05:53:14.751978 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 05:53:14 crc kubenswrapper[4865]: W1205 05:53:14.852003 4865 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 05 05:53:14 crc kubenswrapper[4865]: I1205 05:53:14.852085 4865 trace.go:236] Trace[443641134]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 
05:53:04.850) (total time: 10001ms): Dec 05 05:53:14 crc kubenswrapper[4865]: Trace[443641134]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (05:53:14.851) Dec 05 05:53:14 crc kubenswrapper[4865]: Trace[443641134]: [10.001793845s] [10.001793845s] END Dec 05 05:53:14 crc kubenswrapper[4865]: E1205 05:53:14.852106 4865 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 05 05:53:15 crc kubenswrapper[4865]: W1205 05:53:15.376371 4865 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 05 05:53:15 crc kubenswrapper[4865]: I1205 05:53:15.376444 4865 trace.go:236] Trace[581336292]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 05:53:05.375) (total time: 10000ms): Dec 05 05:53:15 crc kubenswrapper[4865]: Trace[581336292]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (05:53:15.376) Dec 05 05:53:15 crc kubenswrapper[4865]: Trace[581336292]: [10.000667543s] [10.000667543s] END Dec 05 05:53:15 crc kubenswrapper[4865]: E1205 05:53:15.376470 4865 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 05 05:53:16 crc kubenswrapper[4865]: E1205 05:53:16.881485 4865 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.187e3bea29de73a3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 05:53:00.886643619 +0000 UTC m=+0.166654851,LastTimestamp:2025-12-05 05:53:00.886643619 +0000 UTC m=+0.166654851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 05:53:17 crc kubenswrapper[4865]: I1205 05:53:17.192189 4865 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 05 05:53:17 crc kubenswrapper[4865]: I1205 05:53:17.278952 4865 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded" start-of-body= Dec 05 05:53:17 crc kubenswrapper[4865]: I1205 05:53:17.279034 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded" Dec 05 05:53:17 crc kubenswrapper[4865]: I1205 05:53:17.382915 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:17 crc kubenswrapper[4865]: I1205 05:53:17.384470 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:17 crc kubenswrapper[4865]: I1205 05:53:17.384702 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:17 crc kubenswrapper[4865]: I1205 05:53:17.384795 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:17 crc kubenswrapper[4865]: I1205 05:53:17.384912 4865 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 05:53:17 crc kubenswrapper[4865]: I1205 05:53:17.942173 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 05 05:53:17 crc kubenswrapper[4865]: I1205 05:53:17.942300 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:17 crc kubenswrapper[4865]: I1205 05:53:17.943452 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:17 crc kubenswrapper[4865]: I1205 05:53:17.943487 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:17 crc kubenswrapper[4865]: I1205 05:53:17.943499 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:17 crc kubenswrapper[4865]: I1205 05:53:17.954191 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 05 05:53:18 crc kubenswrapper[4865]: I1205 05:53:18.095537 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:18 crc kubenswrapper[4865]: I1205 05:53:18.096314 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:18 crc kubenswrapper[4865]: I1205 05:53:18.096358 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:18 crc kubenswrapper[4865]: I1205 05:53:18.096369 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:18 crc kubenswrapper[4865]: I1205 05:53:18.468966 4865 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 05 05:53:18 crc kubenswrapper[4865]: I1205 05:53:18.469048 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 05 05:53:21 crc kubenswrapper[4865]: E1205 05:53:21.076960 4865 eviction_manager.go:285] "Eviction manager: failed to get summary 
stats" err="failed to get node info: node \"crc\" not found" Dec 05 05:53:22 crc kubenswrapper[4865]: I1205 05:53:22.285596 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:53:22 crc kubenswrapper[4865]: I1205 05:53:22.285973 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:22 crc kubenswrapper[4865]: I1205 05:53:22.287669 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:22 crc kubenswrapper[4865]: I1205 05:53:22.287809 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:22 crc kubenswrapper[4865]: I1205 05:53:22.287940 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:22 crc kubenswrapper[4865]: I1205 05:53:22.292325 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.107084 4865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.107134 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.108034 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.108075 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.108085 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.475583 4865 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 05:53:23 crc kubenswrapper[4865]: E1205 05:53:23.477411 4865 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.477979 4865 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.487384 4865 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.543793 4865 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42272->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.543863 4865 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42274->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.543881 4865 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42272->192.168.126.11:17697: read: connection reset by peer" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.543921 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42274->192.168.126.11:17697: read: connection reset by peer" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.544151 4865 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.544170 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.680695 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.680882 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.680934 4865 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.681989 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.682039 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.682054 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.685930 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.885600 4865 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.894195 4865 apiserver.go:52] "Watching apiserver" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.900203 4865 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.900444 4865 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.900754 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.900839 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.900935 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.900937 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.900968 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 05:53:23 crc kubenswrapper[4865]: E1205 05:53:23.901159 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 05:53:23 crc kubenswrapper[4865]: E1205 05:53:23.901267 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.901343 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:23 crc kubenswrapper[4865]: E1205 05:53:23.901470 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.907432 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.907886 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.908009 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.908138 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.908398 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.908589 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.908759 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.911981 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.912211 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.956600 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.975005 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.988341 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:23 crc kubenswrapper[4865]: I1205 05:53:23.989670 4865 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.001696 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.011115 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.021663 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.031081 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.086500 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.086546 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.086573 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.086597 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.086617 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.086951 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.086981 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 
05:53:24.087004 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.087033 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.087052 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.087139 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:24.587119751 +0000 UTC m=+23.867130973 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.087167 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.087379 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.087541 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.087571 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). 
InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.087576 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.087908 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.087926 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.087976 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.088054 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.088078 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.088113 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.088145 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.088199 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.088419 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.088577 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.088630 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.088648 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.088662 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.088817 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.088848 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.088877 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.088896 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.089035 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.089147 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.089192 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.089767 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.089903 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.089941 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.089964 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.090154 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.090386 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.089978 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.090440 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.090457 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.090462 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.090472 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.090509 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.090533 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.091017 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.091034 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.090692 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: 
"serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.090761 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.090853 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.090964 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.090974 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.091239 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.091281 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.091287 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.091302 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.091320 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.091338 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.091360 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.091363 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.091393 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.091867 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.091893 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.091892 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.091943 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.091967 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092086 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092244 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092318 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092348 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092372 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092392 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092321 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092416 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092391 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092441 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092472 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092494 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092514 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092565 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092585 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092605 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092630 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092651 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092675 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092699 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092720 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092741 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092762 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092783 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092805 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092847 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092869 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092444 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092552 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.093867 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.093893 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092628 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092690 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092799 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092932 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). 
InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.092903 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.094082 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.094254 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.094285 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.094359 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.093158 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.093157 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.093195 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.093311 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.093374 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.093449 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.093475 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.093585 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.093597 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.093680 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.093809 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.094459 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.094491 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.094514 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.094535 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.094553 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.094569 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.093887 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.094879 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.094729 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.094931 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.095094 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.095184 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.095221 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.095276 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.094583 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.095295 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.095318 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.095463 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.095523 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.095612 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.095664 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.095775 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.095818 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.095865 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.095888 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.095908 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.095928 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.095950 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.095971 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.095991 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.096011 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.096031 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.096051 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.096072 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.096091 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.096109 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.096130 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.096149 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.096170 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.096194 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.096214 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101046 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101082 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101106 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101129 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101152 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101176 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101198 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101216 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101238 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101260 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101282 4865 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101302 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101318 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101333 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101353 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101375 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101400 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101419 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101441 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101461 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101484 4865 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101528 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101549 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101763 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101790 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101810 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101897 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101916 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101936 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.101960 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 05:53:24 crc kubenswrapper[4865]: 
I1205 05:53:24.101985 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102005 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102028 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102051 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102074 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102097 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102118 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102139 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102162 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102183 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102205 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102226 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102248 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102270 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102290 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102311 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102334 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102356 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102376 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102400 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 05:53:24 crc 
kubenswrapper[4865]: I1205 05:53:24.102423 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102443 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102467 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102487 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102509 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102530 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102554 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102575 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102595 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102618 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102642 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102663 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102686 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102707 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102730 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102753 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102776 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102989 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103031 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103051 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103071 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103091 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103108 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103136 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103155 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103172 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103191 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103206 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103221 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103236 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103252 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103267 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103283 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103301 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103317 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103334 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103351 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103371 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103388 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103403 4865 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103419 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103438 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103455 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103472 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103491 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103509 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103525 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103542 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103749 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 
05:53:24.103766 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103785 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103803 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103869 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103891 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103908 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103948 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103967 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103987 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104005 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104021 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104038 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104059 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104078 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104098 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104115 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104135 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104157 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104180 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104201 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104260 4865 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104271 4865 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104280 4865 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104289 4865 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104298 4865 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104308 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104317 4865 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104326 4865 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104336 4865 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104345 4865 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node 
\"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104354 4865 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104364 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104373 4865 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104383 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104393 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104402 4865 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104411 4865 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104422 4865 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104431 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104441 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104449 4865 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104459 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104468 4865 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc 
kubenswrapper[4865]: I1205 05:53:24.104476 4865 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104485 4865 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104494 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104503 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104513 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104522 4865 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104531 4865 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104539 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104549 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104588 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104598 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104609 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104618 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 05 
05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104629 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104638 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104647 4865 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104657 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104666 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104675 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104684 4865 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104693 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104702 4865 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104710 4865 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104719 4865 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104728 4865 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104738 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc 
kubenswrapper[4865]: I1205 05:53:24.104748 4865 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104757 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104765 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104774 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104783 4865 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104793 4865 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104802 4865 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104811 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104853 4865 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104863 4865 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104873 4865 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104883 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104892 4865 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc 
kubenswrapper[4865]: I1205 05:53:24.104909 4865 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104918 4865 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104927 4865 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104936 4865 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104947 4865 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104990 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.105001 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.105010 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.105021 4865 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.102997 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103184 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103516 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103487 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.103763 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.105860 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104018 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104107 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104171 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104383 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104389 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104559 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104899 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104905 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.104913 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.105173 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.105210 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.105244 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.105454 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.105586 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.105628 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.105677 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.105732 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.106078 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.106319 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.106327 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.106261 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.106637 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.106661 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.106693 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.106882 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.107011 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.107020 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.107113 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.107372 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.107410 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.107580 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.107867 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.106590 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.107892 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.108096 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.108366 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.108424 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.108604 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.108848 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.108639 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.108987 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). 
InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.109156 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.109188 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.109287 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.109421 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.109434 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.109460 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.109602 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.109633 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.109676 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.109700 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.109816 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.109987 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.109997 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.110071 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.111416 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.110178 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.110235 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.110645 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.110694 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.111071 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.111148 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.111254 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.112368 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.112372 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.112648 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.112694 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.112951 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.113016 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.113040 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.113325 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.113334 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.113546 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.113458 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.113733 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.113760 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.114626 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.114881 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.115123 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.115459 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.116471 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.117113 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.117458 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.118673 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.118959 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.121619 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.122066 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.122783 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.122854 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.122914 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.123385 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.124726 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.124953 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.125271 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.125363 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.126334 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.126382 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.126727 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.126842 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.126846 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.126864 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.127058 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.126880 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.128593 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.128692 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.128192 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.129269 4865 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6cd4f8940401d4b905b9766810dea6fd3d7543e9d0b204d61a8de150415d2644" exitCode=255 Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.129689 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.130047 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.130154 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.130215 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6cd4f8940401d4b905b9766810dea6fd3d7543e9d0b204d61a8de150415d2644"} Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.130346 4865 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.130843 4865 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.130855 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.130909 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:24.630892368 +0000 UTC m=+23.910903670 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.131078 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.133017 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.139914 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.140257 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.140679 4865 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.140744 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:24.640731198 +0000 UTC m=+23.920742420 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.140943 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.145421 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.145994 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.152003 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.156135 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.156556 4865 scope.go:117] "RemoveContainer" containerID="6cd4f8940401d4b905b9766810dea6fd3d7543e9d0b204d61a8de150415d2644" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.156595 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.156624 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.162228 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.166873 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.168266 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.175128 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.175692 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.186469 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.186501 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.186513 4865 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.186567 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:24.686549422 +0000 UTC m=+23.966560634 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.186662 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.189649 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.199235 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.199274 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.199288 4865 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.199344 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:24.699323746 +0000 UTC m=+23.979334968 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.205715 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.205806 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.205875 4865 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.205891 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.205903 4865 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.205916 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.205929 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.205942 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.205954 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.205963 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.205974 4865 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.205987 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.205998 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206010 4865 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206021 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206033 4865 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206045 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206056 4865 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206068 4865 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206080 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206092 4865 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206108 4865 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206121 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206134 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206147 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206159 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206170 4865 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206181 4865 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206192 4865 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206202 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206213 4865 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206226 
4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206237 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206251 4865 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206262 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206274 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206285 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206297 4865 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206308 4865 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206321 4865 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206334 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206344 4865 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206356 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206366 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206379 4865 reconciler_common.go:293] "Volume 
detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206390 4865 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206401 4865 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206413 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206425 4865 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206436 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206447 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206460 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206472 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206484 4865 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206497 4865 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206509 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206519 4865 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: 
I1205 05:53:24.206530 4865 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206541 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206552 4865 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206563 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206575 4865 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206587 4865 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206598 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206608 4865 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206618 4865 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206630 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206641 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206652 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206666 4865 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206676 4865 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206688 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206699 4865 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206711 4865 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206721 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206733 4865 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206744 4865 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206755 4865 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206766 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206777 4865 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206791 4865 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206803 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206814 4865 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206846 4865 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" 
(UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206857 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206867 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206878 4865 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206889 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206900 4865 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206911 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206923 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206934 4865 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206946 4865 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206957 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206957 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.206968 4865 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: 
I1205 05:53:24.207023 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207041 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207055 4865 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207072 4865 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207082 4865 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207093 4865 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207104 4865 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207114 4865 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207124 4865 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207135 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207145 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207156 4865 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207166 4865 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207177 4865 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207187 4865 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207199 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207209 4865 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207219 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207229 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207240 4865 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207251 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207261 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207271 4865 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207281 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207291 4865 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207301 4865 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207312 4865 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207322 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207333 4865 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207344 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.207354 4865 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.227079 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.236954 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.238511 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.255006 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.268044 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.278318 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:24 crc kubenswrapper[4865]: W1205 05:53:24.289767 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-27c4ff114a54a963b60700f0671acd770834d5479c75b476a398c314ecf612e2 WatchSource:0}: Error finding container 27c4ff114a54a963b60700f0671acd770834d5479c75b476a398c314ecf612e2: Status 404 returned error can't find the container with id 27c4ff114a54a963b60700f0671acd770834d5479c75b476a398c314ecf612e2 Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.318996 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.341126 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.374106 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.456052 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.525007 4865 csr.go:261] certificate signing request csr-pz5w6 is approved, waiting to be issued Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.591076 4865 csr.go:257] certificate signing request csr-pz5w6 is issued Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.610624 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.610880 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:25.610840011 +0000 UTC m=+24.890851233 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.678097 4865 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.711575 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.711630 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.711662 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:53:24 crc kubenswrapper[4865]: I1205 05:53:24.711691 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.711774 4865 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.711853 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:25.711816616 +0000 UTC m=+24.991827838 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.711954 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.711970 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.711985 4865 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.712017 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:25.712008082 +0000 UTC m=+24.992019304 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.712071 4865 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.712097 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:25.712090134 +0000 UTC m=+24.992101356 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.712159 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.712171 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.712180 4865 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:24 crc kubenswrapper[4865]: E1205 05:53:24.712205 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:25.712197817 +0000 UTC m=+24.992209039 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.005767 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:53:25 crc kubenswrapper[4865]: E1205 05:53:25.005969 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.010011 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.010546 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.057757 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.084135 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.084839 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.085385 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.086093 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.086658 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.087329 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.087923 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.088472 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.089366 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.089941 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.090512 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.091107 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.091691 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.095000 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.095435 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.096666 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.097331 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.097843 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.101964 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.102508 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.103723 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.104190 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.106732 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.108061 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.108602 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.109398 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.110436 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.110999 4865 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.111117 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.114313 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.114877 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.115363 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.119395 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.120255 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.120945 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.122105 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.123917 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.124432 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.125179 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.127423 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.128541 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.129085 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.130230 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.133489 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.135538 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.135997 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.136149 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.137653 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.138391 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.142001 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.142725 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.143260 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.144216 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-9cxx2"] Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.144431 4865 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fc6942270139af3934b85e9a4f963915463acb7bceddc1bfa00c229935877867"} Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.144460 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9b4df5a56337b0823b6583ee3eaf1209dc07210c18f9762d768aeea754cc3e88"} Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.144473 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"27c4ff114a54a963b60700f0671acd770834d5479c75b476a398c314ecf612e2"} Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.144483 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5cb791773057ede6eb6d934782bfe7ceebda6e50b1491acfa4eac3aeebb45ff4"} Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.144570 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9cxx2" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.147151 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.147558 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.147591 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.163683 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9cxx2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeace6ed-0114-43b9-a3cb-1eac28798d15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfnmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9cxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.194648 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f189b188-8dd3-4ac5-88ee-8736384c0427\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef72ed83c6c469013e019cc8cfd2fef94b5a732b139335f4872676689764a14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4decfc2a1886b97bc70deec1e0391c2ec3f1963ccf9bd83d9893d6cf1459a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358257
71aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://873f148d20c53be8eb8eaab2ceb12db015b6020bb2014f288d1eec4f858f30ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3c110665bc4baa2c7de39d20e7146d08e4adde3583630662a67eb60a3fcc7bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.209443 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.215635 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eeace6ed-0114-43b9-a3cb-1eac28798d15-hosts-file\") pod \"node-resolver-9cxx2\" (UID: \"eeace6ed-0114-43b9-a3cb-1eac28798d15\") " pod="openshift-dns/node-resolver-9cxx2" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.215886 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfnmk\" (UniqueName: \"kubernetes.io/projected/eeace6ed-0114-43b9-a3cb-1eac28798d15-kube-api-access-dfnmk\") pod \"node-resolver-9cxx2\" (UID: \"eeace6ed-0114-43b9-a3cb-1eac28798d15\") " pod="openshift-dns/node-resolver-9cxx2" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.245601 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.286901 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.301599 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.317156 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eeace6ed-0114-43b9-a3cb-1eac28798d15-hosts-file\") pod \"node-resolver-9cxx2\" (UID: \"eeace6ed-0114-43b9-a3cb-1eac28798d15\") " pod="openshift-dns/node-resolver-9cxx2" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.317424 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfnmk\" (UniqueName: \"kubernetes.io/projected/eeace6ed-0114-43b9-a3cb-1eac28798d15-kube-api-access-dfnmk\") pod \"node-resolver-9cxx2\" (UID: \"eeace6ed-0114-43b9-a3cb-1eac28798d15\") " pod="openshift-dns/node-resolver-9cxx2" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.317312 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eeace6ed-0114-43b9-a3cb-1eac28798d15-hosts-file\") pod \"node-resolver-9cxx2\" (UID: \"eeace6ed-0114-43b9-a3cb-1eac28798d15\") " pod="openshift-dns/node-resolver-9cxx2" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.349385 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed818037-beb4-4918-a648-c51549a1b8dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://644d5d1cdf80e1e990ac2681ed71aa80a5197b2886d66f5bc92ce197c3c57077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81b5ebabadb97cd677f48cadea47e96aeddeb43f21218d18a8fc35d2f80694cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67dcec4a9a0ff2f2778e848f4d11a689497c63530d2a950e74ef11f7405aeff1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd4f8940401d4b905b9766810dea6fd3d7543e9d0b204d61a8de150415d2644\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd4f8940401d4b905b9766810dea6fd3d7543e9d0b204d61a8de150415d2644\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05
T05:53:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 05:53:14.210312 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 05:53:14.211612 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3722328178/tls.crt::/tmp/serving-cert-3722328178/tls.key\\\\\\\"\\\\nI1205 05:53:23.489235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 05:53:23.507619 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 05:53:23.507648 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 05:53:23.507669 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 05:53:23.507675 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 05:53:23.530145 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 05:53:23.530185 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 05:53:23.530191 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 05:53:23.530196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 05:53:23.530200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 05:53:23.530205 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 05:53:23.530210 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 05:53:23.530546 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 05:53:23.537922 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T05:53:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ff59482544992667891e11e736e2ff3c3c093fdc68505dbddca74bd065bbc75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b41e449fb8368c47471e8c81ee727a6c8346cd627097d6e50d17234505b1298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b41e449fb8368c47471e8c81ee727a6c8346cd627097d6e50d17234505b1298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T05:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.366736 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfnmk\" (UniqueName: \"kubernetes.io/projected/eeace6ed-0114-43b9-a3cb-1eac28798d15-kube-api-access-dfnmk\") pod \"node-resolver-9cxx2\" (UID: \"eeace6ed-0114-43b9-a3cb-1eac28798d15\") " pod="openshift-dns/node-resolver-9cxx2" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.383177 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.412851 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.558141 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9cxx2" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.592602 4865 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-05 05:48:24 +0000 UTC, rotation deadline is 2026-09-07 11:52:58.629883356 +0000 UTC Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.592656 4865 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6629h59m33.037229924s for next certificate rotation Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.620399 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:25 crc kubenswrapper[4865]: E1205 05:53:25.620543 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:27.620527826 +0000 UTC m=+26.900539048 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.650290 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-bpkm9"] Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.650635 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bpkm9" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.909651 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.910088 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.910148 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.910385 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.911152 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.911604 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-os-release\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.911638 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-host-var-lib-cni-bin\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.911660 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz2xt\" (UniqueName: \"kubernetes.io/projected/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-kube-api-access-qz2xt\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.911685 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-cnibin\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.911708 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-multus-daemon-config\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.911729 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-host-run-multus-certs\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.911747 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-cni-binary-copy\") pod 
\"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.911772 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-multus-cni-dir\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.911793 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-multus-socket-dir-parent\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.911812 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-host-run-k8s-cni-cncf-io\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.911847 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-hostroot\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.911875 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.911898 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.911924 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.911945 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-host-var-lib-cni-multus\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.911968 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.911991 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-host-run-netns\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:25 crc kubenswrapper[4865]: E1205 05:53:25.912094 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 05:53:25 crc kubenswrapper[4865]: E1205 05:53:25.912109 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 05:53:25 crc kubenswrapper[4865]: E1205 05:53:25.912145 4865 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:25 crc kubenswrapper[4865]: E1205 05:53:25.912212 4865 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 05:53:25 crc kubenswrapper[4865]: E1205 05:53:25.912253 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:27.912240581 +0000 UTC m=+27.192251803 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 05:53:25 crc kubenswrapper[4865]: E1205 05:53:25.912315 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 05:53:25 crc kubenswrapper[4865]: E1205 05:53:25.912326 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 05:53:25 crc kubenswrapper[4865]: E1205 05:53:25.912337 4865 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:25 crc kubenswrapper[4865]: E1205 05:53:25.912375 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:27.912367614 +0000 UTC m=+27.192378836 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.912408 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-host-var-lib-kubelet\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:25 crc kubenswrapper[4865]: E1205 05:53:25.912414 4865 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.912425 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-multus-conf-dir\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:25 crc kubenswrapper[4865]: E1205 05:53:25.912489 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:27.912473477 +0000 UTC m=+27.192484699 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.912512 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-etc-kubernetes\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:25 crc kubenswrapper[4865]: E1205 05:53:25.912551 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:27.912543879 +0000 UTC m=+27.192555101 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.912568 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-system-cni-dir\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.935188 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.962776 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:25 crc kubenswrapper[4865]: I1205 05:53:25.983214 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9cxx2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeace6ed-0114-43b9-a3cb-1eac28798d15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfnmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9cxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.004750 4865 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f189b188-8dd3-4ac5-88ee-8736384c0427\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef72ed83c6c469013e019cc8cfd2fef94b5a732b139335f4872676689764a14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4decfc2a1886b97bc70deec1e0391c2ec3f1963ccf9bd83d9893d6cf1459a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://873f148d20c53be8eb8eaab2ceb12db015b6020bb2014f288d1eec4f858f30ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3c110665bc4baa2c7de39d20e7146d08e4adde3583630662a
67eb60a3fcc7bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.005639 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:26 crc kubenswrapper[4865]: E1205 05:53:26.005781 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.005843 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 05:53:26 crc kubenswrapper[4865]: E1205 05:53:26.005883 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013028 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-multus-cni-dir\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013064 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-cni-binary-copy\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013082 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-multus-socket-dir-parent\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013097 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-host-run-k8s-cni-cncf-io\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013112 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-hostroot\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013149 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-host-var-lib-cni-multus\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013170 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-system-cni-dir\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013180 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-host-run-k8s-cni-cncf-io\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013190 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-host-run-netns\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013228 4865 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-host-run-netns\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013236 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-host-var-lib-kubelet\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013246 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-host-var-lib-cni-multus\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013248 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-hostroot\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013269 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-host-var-lib-kubelet\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013255 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-multus-conf-dir\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013359 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-etc-kubernetes\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013390 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-etc-kubernetes\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013358 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-system-cni-dir\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013272 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-multus-conf-dir\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc 
kubenswrapper[4865]: I1205 05:53:26.013399 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-os-release\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013457 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-host-var-lib-cni-bin\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013480 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz2xt\" (UniqueName: \"kubernetes.io/projected/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-kube-api-access-qz2xt\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013500 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-cnibin\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013507 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-host-var-lib-cni-bin\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013516 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-multus-daemon-config\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013536 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-host-run-multus-certs\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013565 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-cnibin\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013577 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-os-release\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013583 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-host-run-multus-certs\") pod \"multus-bpkm9\" (UID: 
\"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013622 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-multus-cni-dir\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.013619 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-multus-socket-dir-parent\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.014497 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-cni-binary-copy\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.014533 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-multus-daemon-config\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.025412 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.046479 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz2xt\" (UniqueName: \"kubernetes.io/projected/2d1a82bf-1dc7-48e4-b2e2-32514537aae7-kube-api-access-qz2xt\") pod \"multus-bpkm9\" (UID: \"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\") " pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.106670 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.109110 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g5k4k"] Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.109795 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.110391 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-hhx2r"] Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.110630 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.113794 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e740ad4f-4c03-467b-8f0f-4fec2493d426-ovnkube-script-lib\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.113843 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.113874 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-run-systemd\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.113890 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-var-lib-openvswitch\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.113909 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx4n2\" (UniqueName: \"kubernetes.io/projected/e740ad4f-4c03-467b-8f0f-4fec2493d426-kube-api-access-zx4n2\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.113924 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c1356a0a-4e64-49b5-b640-3779d3abe333-rootfs\") pod \"machine-config-daemon-hhx2r\" (UID: \"c1356a0a-4e64-49b5-b640-3779d3abe333\") " pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.113938 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-kubelet\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.113955 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c1356a0a-4e64-49b5-b640-3779d3abe333-mcd-auth-proxy-config\") pod \"machine-config-daemon-hhx2r\" (UID: \"c1356a0a-4e64-49b5-b640-3779d3abe333\") " pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.114034 4865 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pgc7\" (UniqueName: \"kubernetes.io/projected/c1356a0a-4e64-49b5-b640-3779d3abe333-kube-api-access-6pgc7\") pod \"machine-config-daemon-hhx2r\" (UID: \"c1356a0a-4e64-49b5-b640-3779d3abe333\") " pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.114086 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-run-openvswitch\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.114106 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-run-ovn\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.114130 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-slash\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.114145 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e740ad4f-4c03-467b-8f0f-4fec2493d426-env-overrides\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.114216 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e740ad4f-4c03-467b-8f0f-4fec2493d426-ovnkube-config\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.114256 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-systemd-units\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.114277 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-log-socket\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.114296 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e740ad4f-4c03-467b-8f0f-4fec2493d426-ovn-node-metrics-cert\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" 
Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.114322 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-cni-bin\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.114344 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-cni-netd\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.114384 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-etc-openvswitch\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.114405 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1356a0a-4e64-49b5-b640-3779d3abe333-proxy-tls\") pod \"machine-config-daemon-hhx2r\" (UID: \"c1356a0a-4e64-49b5-b640-3779d3abe333\") " pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.114424 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-run-netns\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.114443 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-node-log\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.114475 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-run-ovn-kubernetes\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.120474 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.121652 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-kmtnk"] Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.121777 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.122243 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.128928 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.129230 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.129363 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.139233 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.143509 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9cb18bd7a5b7d0079fde8b898adc71a40e1e9d6e96e6088cc5d2980bbb9b6aeb"} Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.144444 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9cxx2" event={"ID":"eeace6ed-0114-43b9-a3cb-1eac28798d15","Type":"ContainerStarted","Data":"d8e762688c8729230d8bed7e678ff906a844c83e658b2717e39cf0463f3d55ae"} Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.144470 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9cxx2" event={"ID":"eeace6ed-0114-43b9-a3cb-1eac28798d15","Type":"ContainerStarted","Data":"b2cd87d9223b7c14c9ccaedeb0e28caf3ff18f5f913d1096cf568a10f2aadb93"} Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.146188 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"fab26569035f876c8d1d404dbbf5a800acd38d95a6b2bd3de928879786db2812"} Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.146214 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.156152 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.156228 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.156272 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.156405 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.156478 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.156491 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.156566 4865 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.156639 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.166806 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bpkm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz2xt\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bpkm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.193603 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed818037-beb4-4918-a648-c51549a1b8dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://644d5d1cdf80e1e990ac2681ed71aa80a5197b2886d66f5bc92ce197c3c57077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81b5ebabadb97cd677f48cadea47e96aeddeb43f21218d18a8fc35d2f80694cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67dcec4a9a0ff2f2778e848f4d11a689497c63530d2a950e74ef11f7405aeff1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd4f8940401d4b905b9766810dea6fd3d7543e9d0b204d61a8de150415d2644\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd4f8940401d4b905b9766810dea6fd3d7543e9d0b204d61a8de150415d2644\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 05:53:14.210312 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 05:53:14.211612 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3722328178/tls.crt::/tmp/serving-cert-3722328178/tls.key\\\\\\\"\\\\nI1205 05:53:23.489235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 05:53:23.507619 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 05:53:23.507648 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 05:53:23.507669 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 05:53:23.507675 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 05:53:23.530145 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 05:53:23.530185 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 05:53:23.530191 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 05:53:23.530196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 05:53:23.530200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 05:53:23.530205 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 05:53:23.530210 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 05:53:23.530546 
1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 05:53:23.537922 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T05:53:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ff59482544992667891e11e736e2ff3c3c093fdc68505dbddca74bd065bbc75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b41e449fb8368c47471e8c81ee727a6c8346cd627097d6e50d17234505b1298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b41e449fb8368c47471e8c81ee727a6c8346cd627097d6e50d17234505b1298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T05:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215196 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e740ad4f-4c03-467b-8f0f-4fec2493d426-ovn-node-metrics-cert\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215234 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-systemd-units\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215251 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-log-socket\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215267 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-cni-netd\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215291 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ceda1986-2884-4c10-b39b-0ab350e56ce0-cni-binary-copy\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215308 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fthfs\" (UniqueName: \"kubernetes.io/projected/ceda1986-2884-4c10-b39b-0ab350e56ce0-kube-api-access-fthfs\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215332 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-cni-bin\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215348 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-etc-openvswitch\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215362 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1356a0a-4e64-49b5-b640-3779d3abe333-proxy-tls\") pod \"machine-config-daemon-hhx2r\" (UID: \"c1356a0a-4e64-49b5-b640-3779d3abe333\") " pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215379 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-run-netns\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215395 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-node-log\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215410 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ceda1986-2884-4c10-b39b-0ab350e56ce0-cnibin\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215427 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-run-ovn-kubernetes\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215458 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215492 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e740ad4f-4c03-467b-8f0f-4fec2493d426-ovnkube-script-lib\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215508 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-run-systemd\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215534 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-var-lib-openvswitch\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215566 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx4n2\" (UniqueName: \"kubernetes.io/projected/e740ad4f-4c03-467b-8f0f-4fec2493d426-kube-api-access-zx4n2\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215581 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ceda1986-2884-4c10-b39b-0ab350e56ce0-os-release\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215596 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c1356a0a-4e64-49b5-b640-3779d3abe333-rootfs\") pod \"machine-config-daemon-hhx2r\" (UID: \"c1356a0a-4e64-49b5-b640-3779d3abe333\") " pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215609 4865 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-kubelet\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215625 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pgc7\" (UniqueName: \"kubernetes.io/projected/c1356a0a-4e64-49b5-b640-3779d3abe333-kube-api-access-6pgc7\") pod \"machine-config-daemon-hhx2r\" (UID: \"c1356a0a-4e64-49b5-b640-3779d3abe333\") " pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215648 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c1356a0a-4e64-49b5-b640-3779d3abe333-mcd-auth-proxy-config\") pod \"machine-config-daemon-hhx2r\" (UID: \"c1356a0a-4e64-49b5-b640-3779d3abe333\") " pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215664 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-run-openvswitch\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215679 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-run-ovn\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215693 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-slash\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215707 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e740ad4f-4c03-467b-8f0f-4fec2493d426-env-overrides\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215721 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ceda1986-2884-4c10-b39b-0ab350e56ce0-system-cni-dir\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215737 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ceda1986-2884-4c10-b39b-0ab350e56ce0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 
crc kubenswrapper[4865]: I1205 05:53:26.215751 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e740ad4f-4c03-467b-8f0f-4fec2493d426-ovnkube-config\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.215764 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ceda1986-2884-4c10-b39b-0ab350e56ce0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.216240 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-run-systemd\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.216275 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-systemd-units\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.216297 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-run-netns\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.216320 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-node-log\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.216911 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-cni-bin\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.216998 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-run-ovn-kubernetes\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.217263 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-etc-openvswitch\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.217287 4865 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c1356a0a-4e64-49b5-b640-3779d3abe333-rootfs\") pod \"machine-config-daemon-hhx2r\" (UID: \"c1356a0a-4e64-49b5-b640-3779d3abe333\") " pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.217305 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-var-lib-openvswitch\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.217406 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.218055 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-log-socket\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.218096 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-run-ovn\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.218119 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-run-openvswitch\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.218138 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e740ad4f-4c03-467b-8f0f-4fec2493d426-ovnkube-script-lib\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.218143 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-cni-netd\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.218159 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-kubelet\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.218175 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-slash\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.218305 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bpkm9" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.219994 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1356a0a-4e64-49b5-b640-3779d3abe333-proxy-tls\") pod \"machine-config-daemon-hhx2r\" (UID: \"c1356a0a-4e64-49b5-b640-3779d3abe333\") " pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.222861 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c1356a0a-4e64-49b5-b640-3779d3abe333-mcd-auth-proxy-config\") pod \"machine-config-daemon-hhx2r\" (UID: \"c1356a0a-4e64-49b5-b640-3779d3abe333\") " pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.226392 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.227048 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e740ad4f-4c03-467b-8f0f-4fec2493d426-env-overrides\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.227450 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e740ad4f-4c03-467b-8f0f-4fec2493d426-ovn-node-metrics-cert\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.227916 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e740ad4f-4c03-467b-8f0f-4fec2493d426-ovnkube-config\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: W1205 05:53:26.233502 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d1a82bf_1dc7_48e4_b2e2_32514537aae7.slice/crio-221c51eff964d5104e4fb4b9133a514e64d5e9db3ba006bac22b8cc86a6b1346 WatchSource:0}: Error finding container 221c51eff964d5104e4fb4b9133a514e64d5e9db3ba006bac22b8cc86a6b1346: Status 404 returned error can't find the container with id 221c51eff964d5104e4fb4b9133a514e64d5e9db3ba006bac22b8cc86a6b1346 Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.259511 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pgc7\" (UniqueName: \"kubernetes.io/projected/c1356a0a-4e64-49b5-b640-3779d3abe333-kube-api-access-6pgc7\") pod \"machine-config-daemon-hhx2r\" (UID: \"c1356a0a-4e64-49b5-b640-3779d3abe333\") " pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.259951 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx4n2\" (UniqueName: \"kubernetes.io/projected/e740ad4f-4c03-467b-8f0f-4fec2493d426-kube-api-access-zx4n2\") pod \"ovnkube-node-g5k4k\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.274066 
4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.292637 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed818037-beb4-4918-a648-c51549a1b8dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://644d5d1cdf80e1e990ac2681ed71aa80a5197b2886d66f5bc92ce197c3c57077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81b5ebabadb97cd677f48cadea47e96aeddeb43f21218d18a8fc35d2f80694cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67dcec4a9a0ff2f2778e848f4d11a689497c63530d2a950e74ef11f7405aeff1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6942270139af3934b85e9a4f963915463acb7bceddc1bfa00c229935877867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd4f8940401d4b905b9766810dea6fd3d7543e9d0b204d61a8de150415d2644\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 05:53:14.210312 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 05:53:14.211612 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3722328178/tls.crt::/tmp/serving-cert-3722328178/tls.key\\\\\\\"\\\\nI1205 05:53:23.489235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 05:53:23.507619 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 05:53:23.507648 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 05:53:23.507669 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 05:53:23.507675 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 05:53:23.530145 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 05:53:23.530185 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 05:53:23.530191 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 05:53:23.530196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 05:53:23.530200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 05:53:23.530205 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 05:53:23.530210 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 05:53:23.530546 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 05:53:23.537922 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T05:53:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ff59482544992667891e11e736e2ff3c3c093fdc68505dbddca74bd065bbc75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b41e449fb8368c47471e8c81ee727a6c8346cd627097d6e50d17234505b1298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b41e449fb8368c47471e8c81ee727a6c8346cd627097d6e50d17234505b1298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T05:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.309469 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.317038 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ceda1986-2884-4c10-b39b-0ab350e56ce0-os-release\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.317091 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ceda1986-2884-4c10-b39b-0ab350e56ce0-system-cni-dir\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.317113 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ceda1986-2884-4c10-b39b-0ab350e56ce0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.317133 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ceda1986-2884-4c10-b39b-0ab350e56ce0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.317156 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ceda1986-2884-4c10-b39b-0ab350e56ce0-cni-binary-copy\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" 
Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.317174 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fthfs\" (UniqueName: \"kubernetes.io/projected/ceda1986-2884-4c10-b39b-0ab350e56ce0-kube-api-access-fthfs\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.317210 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ceda1986-2884-4c10-b39b-0ab350e56ce0-cnibin\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.317259 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ceda1986-2884-4c10-b39b-0ab350e56ce0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.317271 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ceda1986-2884-4c10-b39b-0ab350e56ce0-cnibin\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.317940 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ceda1986-2884-4c10-b39b-0ab350e56ce0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.317992 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ceda1986-2884-4c10-b39b-0ab350e56ce0-cni-binary-copy\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.318020 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ceda1986-2884-4c10-b39b-0ab350e56ce0-os-release\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.318044 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ceda1986-2884-4c10-b39b-0ab350e56ce0-system-cni-dir\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: \"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.335710 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fthfs\" (UniqueName: \"kubernetes.io/projected/ceda1986-2884-4c10-b39b-0ab350e56ce0-kube-api-access-fthfs\") pod \"multus-additional-cni-plugins-kmtnk\" (UID: 
\"ceda1986-2884-4c10-b39b-0ab350e56ce0\") " pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.338812 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e740ad4f-4c03-467b-8f0f-4fec2493d426\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47
ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mou
ntPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":
\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g5k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.354490 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.368766 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.382245 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9cxx2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeace6ed-0114-43b9-a3cb-1eac28798d15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfnmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9cxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.399301 4865 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-kmtnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceda1986-2884-4c10-b39b-0ab350e56ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\
\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kmtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.414644 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.421173 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.428680 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1356a0a-4e64-49b5-b640-3779d3abe333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pgc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pgc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hhx2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.428744 4865 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:26 crc kubenswrapper[4865]: W1205 05:53:26.431523 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1356a0a_4e64_49b5_b640_3779d3abe333.slice/crio-f8591504228ce0178855942c181d459c96c801d49c999fdbfdb111e82d7d5a4f WatchSource:0}: Error finding container f8591504228ce0178855942c181d459c96c801d49c999fdbfdb111e82d7d5a4f: Status 404 returned error can't find the container with id f8591504228ce0178855942c181d459c96c801d49c999fdbfdb111e82d7d5a4f Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.439561 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kmtnk" Dec 05 05:53:26 crc kubenswrapper[4865]: W1205 05:53:26.441601 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode740ad4f_4c03_467b_8f0f_4fec2493d426.slice/crio-25d34272017cc6e15cc9923323dccd32c76235fdd2d1a02011dea3e46571dcfa WatchSource:0}: Error finding container 25d34272017cc6e15cc9923323dccd32c76235fdd2d1a02011dea3e46571dcfa: Status 404 returned error can't find the container with id 25d34272017cc6e15cc9923323dccd32c76235fdd2d1a02011dea3e46571dcfa Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.444641 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f189b188-8dd3-4ac5-88ee-8736384c0427\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef72ed83c6c469013e019cc8cfd2fef94b5a732b139335f4872676689764a14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4decfc2a1886b97bc70deec1e0391c2ec3f1963ccf9bd83d9893d6cf1459a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://873f148d20c53be8eb8eaab2ceb12db015b6020bb2014f288d1eec4f858f30ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3c110665bc4baa2c7de39d20e7146d08e4adde3583630662a67eb60a3fcc7bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: W1205 05:53:26.465354 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceda1986_2884_4c10_b39b_0ab350e56ce0.slice/crio-ce6fc3f056fc9025dd2caa584576a53510b138c354826290c0250a0eb6ba5a48 WatchSource:0}: Error finding container ce6fc3f056fc9025dd2caa584576a53510b138c354826290c0250a0eb6ba5a48: Status 404 returned error can't find the container with id ce6fc3f056fc9025dd2caa584576a53510b138c354826290c0250a0eb6ba5a48 Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.471320 4865 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab26569035f876c8d1d404dbbf5a800acd38d95a6b2bd3de928879786db2812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.490866 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:26 crc kubenswrapper[4865]: I1205 05:53:26.510007 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bpkm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz2xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bpkm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.005955 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:53:27 crc kubenswrapper[4865]: E1205 05:53:27.006401 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.162186 4865 generic.go:334] "Generic (PLEG): container finished" podID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerID="26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66" exitCode=0 Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.162263 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" event={"ID":"e740ad4f-4c03-467b-8f0f-4fec2493d426","Type":"ContainerDied","Data":"26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66"} Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.162288 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" event={"ID":"e740ad4f-4c03-467b-8f0f-4fec2493d426","Type":"ContainerStarted","Data":"25d34272017cc6e15cc9923323dccd32c76235fdd2d1a02011dea3e46571dcfa"} Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.166308 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"1535e6216109d3a961855589ba66dbf44c9d14e9f9c0ad646808734c8d44a71c"} Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.166344 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"1da390b15af25b9223a372681201798d719c48662ab76913d773a35198260faf"} Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.166357 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"f8591504228ce0178855942c181d459c96c801d49c999fdbfdb111e82d7d5a4f"} Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.174137 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bpkm9" event={"ID":"2d1a82bf-1dc7-48e4-b2e2-32514537aae7","Type":"ContainerStarted","Data":"26b35f7cbeb5d671a1296bf1f0c1cd2685ea23b78ab0270bffa3b7e547e85b98"} Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.174193 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bpkm9" event={"ID":"2d1a82bf-1dc7-48e4-b2e2-32514537aae7","Type":"ContainerStarted","Data":"221c51eff964d5104e4fb4b9133a514e64d5e9db3ba006bac22b8cc86a6b1346"} Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.175562 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ba2b808ed9f1fba846e84a5d5cdf9f8c8b8f263350fcf0d04419e9d4d6c8e706"} Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.177124 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmtnk" event={"ID":"ceda1986-2884-4c10-b39b-0ab350e56ce0","Type":"ContainerStarted","Data":"ea82a91d83c7818c6322b8ec5e0d1627988980a252f067edb5ce6afd9397ee2a"} Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.177159 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmtnk" 
event={"ID":"ceda1986-2884-4c10-b39b-0ab350e56ce0","Type":"ContainerStarted","Data":"ce6fc3f056fc9025dd2caa584576a53510b138c354826290c0250a0eb6ba5a48"} Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.179075 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"343b34f466580bd7634e4c9a1a20aca0000a609817db41ae527dc89b3b476385"} Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.187339 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.205925 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1356a0a-4e64-49b5-b640-3779d3abe333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pgc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pgc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hhx2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.224575 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f189b188-8dd3-4ac5-88ee-8736384c0427\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef72ed83c6c469013e019cc8cfd2fef94b5a732b139335f4872676689764a14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4decfc2a1886b97bc70deec1e0391c2ec3f1963ccf9bd83d9893d6cf1459a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://873f148d20c53be8eb8eaab2ceb12db015b6020bb2014f288d1eec4f858f30ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3c110665bc4baa2c7de39d20e7146d08e4adde3583630662a67eb60a3fcc7bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.244515 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab26569035f876c8d1d404dbbf5a800acd38d95a6b2bd3de928879786db2812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.258225 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.272248 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bpkm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz2xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bpkm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.288490 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed818037-beb4-4918-a648-c51549a1b8dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://644d5d1cdf80e1e990ac2681ed71aa80a5197b2886d66f5bc92ce197c3c57077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81b5ebabadb97cd677f48cadea47e96aeddeb43f21218d18a8fc35d2f80694cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67dcec4a9a0ff2f2778e848f4d11a689497c63530d2a950e74ef11f7405aeff1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6942270139af3934b85e9a4f963915463acb7bceddc1bfa00c229935877867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://6cd4f8940401d4b905b9766810dea6fd3d7543e9d0b204d61a8de150415d2644\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 05:53:14.210312 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 05:53:14.211612 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3722328178/tls.crt::/tmp/serving-cert-3722328178/tls.key\\\\\\\"\\\\nI1205 05:53:23.489235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 05:53:23.507619 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 05:53:23.507648 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 05:53:23.507669 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 05:53:23.507675 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 05:53:23.530145 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 05:53:23.530185 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 05:53:23.530191 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 05:53:23.530196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 05:53:23.530200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 05:53:23.530205 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 05:53:23.530210 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 05:53:23.530546 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 05:53:23.537922 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T05:53:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ff59482544992667891e11e736e2ff3c3c093fdc68505dbddca74bd065bbc75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b41e449fb8368c47471e8c81ee727a6c8346cd627097d6e50d17234505b1298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b41e449fb8368c47471e8c81ee727a6c8346cd627097d6e50d17234505b1298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T05:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.363552 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.409981 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e740ad4f-4c03-467b-8f0f-4fec2493d426\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T05:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g5k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z 
is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.441942 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.465871 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.502356 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9cxx2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeace6ed-0114-43b9-a3cb-1eac28798d15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfnmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9cxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.523286 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kmtnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceda1986-2884-4c10-b39b-0ab350e56ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kmtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.539950 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f189b188-8dd3-4ac5-88ee-8736384c0427\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef72ed83c6c469013e019cc8cfd2fef94b5a732b139335f4872676689764a14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4decfc2a1886b97bc70deec1e0391c2ec3f1963ccf9bd83d9893d6cf1459a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://873f148d20c53be8eb8eaab2ceb12db015b6020bb2014f288d1eec4f858f30ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3c110665bc4baa2c7de39d20e7146d08e4adde3583630662a67eb60a3fcc7bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.574335 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab26569035f876c8d1d404dbbf5a800acd38d95a6b2bd3de928879786db2812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.596269 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://343b34f466580bd7634e4c9a1a20aca0000a609817db41ae527dc89b3b476385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cb18bd7a5b7d0079fde8b898adc71a40e1e9d6e96e6088cc5d2980bbb9b6aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.616714 4865 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-bpkm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1a82bf-1dc7-48e4-b2e2-32514537aae7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b35f7cbeb5d671a1296bf1f0c1cd2685ea23b78ab0270bffa3b7e547e85b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz2xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bpkm9\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.627723 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:27 crc kubenswrapper[4865]: E1205 05:53:27.627911 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:31.627895204 +0000 UTC m=+30.907906426 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.631183 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed818037-beb4-4918-a648-c51549a1b8dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://644d5d1cdf80e1e990ac2681ed71aa80a5197b2886d66f5bc92ce197c3c57077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81b5ebabadb97cd677f48cadea47e96aeddeb43f21218d18a8fc35d2f80694cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67dcec4a9a0ff2f2778e848f4d11a689497c63530d2a950e74ef11f7405aeff1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6942270139af3934b85e9a4f963915463acb7bceddc1bfa00c229935877867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd4f8940401d4b905b9766810dea6fd3d7543e9d0b204d61a8de150415d2644\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 05:53:14.210312 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 05:53:14.211612 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3722328178/tls.crt::/tmp/serving-cert-3722328178/tls.key\\\\\\\"\\\\nI1205 05:53:23.489235 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 05:53:23.507619 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 05:53:23.507648 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 05:53:23.507669 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 05:53:23.507675 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 05:53:23.530145 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 05:53:23.530185 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 05:53:23.530191 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 05:53:23.530196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 05:53:23.530200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 05:53:23.530205 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 05:53:23.530210 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 05:53:23.530546 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 05:53:23.537922 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T05:53:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ff59482544992667891e11e736e2ff3c3c093fdc68505dbddca74bd065bbc75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b41e449fb8368c47471e8c81ee727a6c8346cd627097d6e50d17234505b1298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b41e449fb8368c47471e8c81ee727a6c8346cd627097d6e50d17234505b1298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T05:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T05:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.647152 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba2b808ed9f1fba846e84a5d5cdf9f8c8b8f263350fcf0d04419e9d4d6c8e706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.674142 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e740ad4f-4c03-467b-8f0f-4fec2493d426\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T05:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx4n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g5k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z 
is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.695467 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.723129 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.749447 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9cxx2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeace6ed-0114-43b9-a3cb-1eac28798d15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e762688c8729230d8bed7e678ff906a844c83e658b2717e39cf0463f3d55ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfnmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9cxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.806524 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kmtnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceda1986-2884-4c10-b39b-0ab350e56ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea82a91d83c7818c6322b8ec5e0d1627988980a252f067edb5ce6afd9397ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"}
,{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fthfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kmtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.830540 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.846594 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1356a0a-4e64-49b5-b640-3779d3abe333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1535e6216109d3a961855589ba66dbf44c9d14e9f9c0ad646808734c8d44a71c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pgc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1da390b15af25b9223a372681201798d719c48662ab76913d773a35198260faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T05:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pgc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T05:53:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hhx2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:27Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.930356 4865 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.930396 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.930416 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:53:27 crc kubenswrapper[4865]: I1205 05:53:27.930445 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:27 crc kubenswrapper[4865]: E1205 05:53:27.930519 4865 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 05:53:27 crc kubenswrapper[4865]: E1205 05:53:27.930565 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:31.930552531 +0000 UTC m=+31.210563753 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 05:53:27 crc kubenswrapper[4865]: E1205 05:53:27.930596 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 05:53:27 crc kubenswrapper[4865]: E1205 05:53:27.930627 4865 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 05:53:27 crc kubenswrapper[4865]: E1205 05:53:27.930635 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 05:53:27 crc kubenswrapper[4865]: E1205 05:53:27.930649 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:31.930642383 +0000 UTC m=+31.210653605 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 05:53:27 crc kubenswrapper[4865]: E1205 05:53:27.930651 4865 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:27 crc kubenswrapper[4865]: E1205 05:53:27.930718 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 05:53:27 crc kubenswrapper[4865]: E1205 05:53:27.930723 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:31.930693605 +0000 UTC m=+31.210704827 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:27 crc kubenswrapper[4865]: E1205 05:53:27.930731 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 05:53:27 crc kubenswrapper[4865]: E1205 05:53:27.930743 4865 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:27 crc kubenswrapper[4865]: E1205 05:53:27.930764 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:31.930757987 +0000 UTC m=+31.210769209 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.006194 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 05:53:28 crc kubenswrapper[4865]: E1205 05:53:28.006324 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.006194 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:28 crc kubenswrapper[4865]: E1205 05:53:28.006390 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.194614 4865 generic.go:334] "Generic (PLEG): container finished" podID="ceda1986-2884-4c10-b39b-0ab350e56ce0" containerID="ea82a91d83c7818c6322b8ec5e0d1627988980a252f067edb5ce6afd9397ee2a" exitCode=0 Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.194683 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmtnk" event={"ID":"ceda1986-2884-4c10-b39b-0ab350e56ce0","Type":"ContainerDied","Data":"ea82a91d83c7818c6322b8ec5e0d1627988980a252f067edb5ce6afd9397ee2a"} Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.200948 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" event={"ID":"e740ad4f-4c03-467b-8f0f-4fec2493d426","Type":"ContainerStarted","Data":"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270"} Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.200986 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" event={"ID":"e740ad4f-4c03-467b-8f0f-4fec2493d426","Type":"ContainerStarted","Data":"2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee"} Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.200997 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" event={"ID":"e740ad4f-4c03-467b-8f0f-4fec2493d426","Type":"ContainerStarted","Data":"6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75"} Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.229289 4865 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T05:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T05:53:28Z is after 2025-08-24T17:21:41Z" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.322528 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9cxx2" podStartSLOduration=3.322508179 podStartE2EDuration="3.322508179s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:28.291003243 +0000 UTC m=+27.571014465" watchObservedRunningTime="2025-12-05 05:53:28.322508179 +0000 UTC m=+27.602519391" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.322757 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jmhlb"] Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.323228 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jmhlb" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.326344 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.326544 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.326717 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.326907 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.334373 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/12271b64-c823-41b3-9a69-ec3400169975-serviceca\") pod \"node-ca-jmhlb\" (UID: \"12271b64-c823-41b3-9a69-ec3400169975\") " pod="openshift-image-registry/node-ca-jmhlb" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.334415 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nd2z\" (UniqueName: \"kubernetes.io/projected/12271b64-c823-41b3-9a69-ec3400169975-kube-api-access-8nd2z\") pod \"node-ca-jmhlb\" (UID: \"12271b64-c823-41b3-9a69-ec3400169975\") " pod="openshift-image-registry/node-ca-jmhlb" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.334516 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/12271b64-c823-41b3-9a69-ec3400169975-host\") pod \"node-ca-jmhlb\" (UID: \"12271b64-c823-41b3-9a69-ec3400169975\") " pod="openshift-image-registry/node-ca-jmhlb" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.438751 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/12271b64-c823-41b3-9a69-ec3400169975-serviceca\") pod \"node-ca-jmhlb\" (UID: \"12271b64-c823-41b3-9a69-ec3400169975\") " pod="openshift-image-registry/node-ca-jmhlb" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.438801 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nd2z\" (UniqueName: \"kubernetes.io/projected/12271b64-c823-41b3-9a69-ec3400169975-kube-api-access-8nd2z\") pod \"node-ca-jmhlb\" (UID: \"12271b64-c823-41b3-9a69-ec3400169975\") " pod="openshift-image-registry/node-ca-jmhlb" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.438875 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12271b64-c823-41b3-9a69-ec3400169975-host\") pod \"node-ca-jmhlb\" (UID: \"12271b64-c823-41b3-9a69-ec3400169975\") " pod="openshift-image-registry/node-ca-jmhlb" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.438942 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12271b64-c823-41b3-9a69-ec3400169975-host\") pod \"node-ca-jmhlb\" (UID: \"12271b64-c823-41b3-9a69-ec3400169975\") " pod="openshift-image-registry/node-ca-jmhlb" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.439725 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/12271b64-c823-41b3-9a69-ec3400169975-serviceca\") pod \"node-ca-jmhlb\" (UID: \"12271b64-c823-41b3-9a69-ec3400169975\") " pod="openshift-image-registry/node-ca-jmhlb" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.457469 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podStartSLOduration=3.457452171 podStartE2EDuration="3.457452171s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:28.418234585 +0000 UTC m=+27.698245817" watchObservedRunningTime="2025-12-05 05:53:28.457452171 +0000 UTC m=+27.737463393" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.457943 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bpkm9" podStartSLOduration=3.457938505 podStartE2EDuration="3.457938505s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:28.457264036 +0000 UTC m=+27.737275258" watchObservedRunningTime="2025-12-05 05:53:28.457938505 +0000 UTC m=+27.737949727" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.469392 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nd2z\" (UniqueName: \"kubernetes.io/projected/12271b64-c823-41b3-9a69-ec3400169975-kube-api-access-8nd2z\") pod \"node-ca-jmhlb\" (UID: \"12271b64-c823-41b3-9a69-ec3400169975\") " pod="openshift-image-registry/node-ca-jmhlb" Dec 05 05:53:28 crc 
kubenswrapper[4865]: I1205 05:53:28.490719 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=4.490694888 podStartE2EDuration="4.490694888s" podCreationTimestamp="2025-12-05 05:53:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:28.486643222 +0000 UTC m=+27.766654444" watchObservedRunningTime="2025-12-05 05:53:28.490694888 +0000 UTC m=+27.770706110" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.646502 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jmhlb" Dec 05 05:53:28 crc kubenswrapper[4865]: I1205 05:53:28.761677 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=4.761661522 podStartE2EDuration="4.761661522s" podCreationTimestamp="2025-12-05 05:53:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:28.718896024 +0000 UTC m=+27.998907246" watchObservedRunningTime="2025-12-05 05:53:28.761661522 +0000 UTC m=+28.041672744" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.005508 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:53:29 crc kubenswrapper[4865]: E1205 05:53:29.005632 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.077229 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n9wvt"] Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.077612 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n9wvt" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.079683 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.083556 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.097309 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-j8p6s"] Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.100439 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:29 crc kubenswrapper[4865]: E1205 05:53:29.100533 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j8p6s" podUID="397265e2-4c59-42f5-8774-357048ba57ac" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.148589 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1dfe264f-be2c-4470-8f1d-fef4bbcccdb2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-n9wvt\" (UID: \"1dfe264f-be2c-4470-8f1d-fef4bbcccdb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n9wvt" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.148696 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1dfe264f-be2c-4470-8f1d-fef4bbcccdb2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-n9wvt\" (UID: \"1dfe264f-be2c-4470-8f1d-fef4bbcccdb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n9wvt" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.148725 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfn9x\" (UniqueName: \"kubernetes.io/projected/397265e2-4c59-42f5-8774-357048ba57ac-kube-api-access-mfn9x\") pod \"network-metrics-daemon-j8p6s\" (UID: \"397265e2-4c59-42f5-8774-357048ba57ac\") " pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.148848 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/397265e2-4c59-42f5-8774-357048ba57ac-metrics-certs\") pod \"network-metrics-daemon-j8p6s\" (UID: \"397265e2-4c59-42f5-8774-357048ba57ac\") " pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.148880 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1dfe264f-be2c-4470-8f1d-fef4bbcccdb2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-n9wvt\" (UID: \"1dfe264f-be2c-4470-8f1d-fef4bbcccdb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n9wvt" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.148909 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wphhx\" (UniqueName: \"kubernetes.io/projected/1dfe264f-be2c-4470-8f1d-fef4bbcccdb2-kube-api-access-wphhx\") pod \"ovnkube-control-plane-749d76644c-n9wvt\" (UID: \"1dfe264f-be2c-4470-8f1d-fef4bbcccdb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n9wvt" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.212789 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" event={"ID":"e740ad4f-4c03-467b-8f0f-4fec2493d426","Type":"ContainerStarted","Data":"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85"} Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.212848 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" event={"ID":"e740ad4f-4c03-467b-8f0f-4fec2493d426","Type":"ContainerStarted","Data":"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90"} Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.212859 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" event={"ID":"e740ad4f-4c03-467b-8f0f-4fec2493d426","Type":"ContainerStarted","Data":"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7"} Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.214637 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmtnk" event={"ID":"ceda1986-2884-4c10-b39b-0ab350e56ce0","Type":"ContainerStarted","Data":"3af2efbb4d7051a3cc11df9bd41495d48d583098573426e32c5c34ca186e50a8"} Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.215702 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jmhlb" event={"ID":"12271b64-c823-41b3-9a69-ec3400169975","Type":"ContainerStarted","Data":"a2579c11a9f7ffead1803f1a242d796ff84a7cafb6b9feb5da215bb2458f9f1c"} Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.215727 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jmhlb" event={"ID":"12271b64-c823-41b3-9a69-ec3400169975","Type":"ContainerStarted","Data":"acaa0c0b6792d04638e58fe6e4bccadd73580a4b7ffc4cf8819fb6dd9778dc3e"} Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.250184 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1dfe264f-be2c-4470-8f1d-fef4bbcccdb2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-n9wvt\" (UID: \"1dfe264f-be2c-4470-8f1d-fef4bbcccdb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n9wvt" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.250221 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1dfe264f-be2c-4470-8f1d-fef4bbcccdb2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-n9wvt\" (UID: \"1dfe264f-be2c-4470-8f1d-fef4bbcccdb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n9wvt" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.250248 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfn9x\" (UniqueName: \"kubernetes.io/projected/397265e2-4c59-42f5-8774-357048ba57ac-kube-api-access-mfn9x\") pod \"network-metrics-daemon-j8p6s\" (UID: \"397265e2-4c59-42f5-8774-357048ba57ac\") " pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.250300 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/397265e2-4c59-42f5-8774-357048ba57ac-metrics-certs\") pod \"network-metrics-daemon-j8p6s\" (UID: \"397265e2-4c59-42f5-8774-357048ba57ac\") " pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.250327 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1dfe264f-be2c-4470-8f1d-fef4bbcccdb2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-n9wvt\" (UID: \"1dfe264f-be2c-4470-8f1d-fef4bbcccdb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n9wvt" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.250352 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wphhx\" (UniqueName: \"kubernetes.io/projected/1dfe264f-be2c-4470-8f1d-fef4bbcccdb2-kube-api-access-wphhx\") pod 
\"ovnkube-control-plane-749d76644c-n9wvt\" (UID: \"1dfe264f-be2c-4470-8f1d-fef4bbcccdb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n9wvt" Dec 05 05:53:29 crc kubenswrapper[4865]: E1205 05:53:29.250913 4865 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 05:53:29 crc kubenswrapper[4865]: E1205 05:53:29.250979 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/397265e2-4c59-42f5-8774-357048ba57ac-metrics-certs podName:397265e2-4c59-42f5-8774-357048ba57ac nodeName:}" failed. No retries permitted until 2025-12-05 05:53:29.750962911 +0000 UTC m=+29.030974133 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/397265e2-4c59-42f5-8774-357048ba57ac-metrics-certs") pod "network-metrics-daemon-j8p6s" (UID: "397265e2-4c59-42f5-8774-357048ba57ac") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.251074 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1dfe264f-be2c-4470-8f1d-fef4bbcccdb2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-n9wvt\" (UID: \"1dfe264f-be2c-4470-8f1d-fef4bbcccdb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n9wvt" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.251475 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1dfe264f-be2c-4470-8f1d-fef4bbcccdb2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-n9wvt\" (UID: \"1dfe264f-be2c-4470-8f1d-fef4bbcccdb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n9wvt" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.253280 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jmhlb" podStartSLOduration=4.253268197 podStartE2EDuration="4.253268197s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:29.250507838 +0000 UTC m=+28.530519060" watchObservedRunningTime="2025-12-05 05:53:29.253268197 +0000 UTC m=+28.533279419" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.266194 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1dfe264f-be2c-4470-8f1d-fef4bbcccdb2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-n9wvt\" (UID: \"1dfe264f-be2c-4470-8f1d-fef4bbcccdb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n9wvt" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.276805 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfn9x\" (UniqueName: \"kubernetes.io/projected/397265e2-4c59-42f5-8774-357048ba57ac-kube-api-access-mfn9x\") pod \"network-metrics-daemon-j8p6s\" (UID: \"397265e2-4c59-42f5-8774-357048ba57ac\") " pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.281147 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wphhx\" (UniqueName: \"kubernetes.io/projected/1dfe264f-be2c-4470-8f1d-fef4bbcccdb2-kube-api-access-wphhx\") pod 
\"ovnkube-control-plane-749d76644c-n9wvt\" (UID: \"1dfe264f-be2c-4470-8f1d-fef4bbcccdb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n9wvt" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.391443 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n9wvt" Dec 05 05:53:29 crc kubenswrapper[4865]: W1205 05:53:29.404267 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dfe264f_be2c_4470_8f1d_fef4bbcccdb2.slice/crio-23e4184348262422f47161c80e659803e6d2653f175e8fad67bca18e94cb07ae WatchSource:0}: Error finding container 23e4184348262422f47161c80e659803e6d2653f175e8fad67bca18e94cb07ae: Status 404 returned error can't find the container with id 23e4184348262422f47161c80e659803e6d2653f175e8fad67bca18e94cb07ae Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.756438 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/397265e2-4c59-42f5-8774-357048ba57ac-metrics-certs\") pod \"network-metrics-daemon-j8p6s\" (UID: \"397265e2-4c59-42f5-8774-357048ba57ac\") " pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:29 crc kubenswrapper[4865]: E1205 05:53:29.756554 4865 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 05:53:29 crc kubenswrapper[4865]: E1205 05:53:29.756676 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/397265e2-4c59-42f5-8774-357048ba57ac-metrics-certs podName:397265e2-4c59-42f5-8774-357048ba57ac nodeName:}" failed. No retries permitted until 2025-12-05 05:53:30.756661988 +0000 UTC m=+30.036673200 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/397265e2-4c59-42f5-8774-357048ba57ac-metrics-certs") pod "network-metrics-daemon-j8p6s" (UID: "397265e2-4c59-42f5-8774-357048ba57ac") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.878191 4865 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.880233 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.880278 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.880293 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.880428 4865 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.887522 4865 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.887843 4865 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.889151 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.889182 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.889192 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.889203 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.889212 4865 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T05:53:29Z","lastTransitionTime":"2025-12-05T05:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.909220 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.909258 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.909267 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.909280 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.909289 4865 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T05:53:29Z","lastTransitionTime":"2025-12-05T05:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.936412 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7"] Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.936798 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.939818 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.940325 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.940520 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.941326 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.957566 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/23203982-11fe-4867-b6b6-c196169cc70c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xmvs7\" (UID: \"23203982-11fe-4867-b6b6-c196169cc70c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.957612 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23203982-11fe-4867-b6b6-c196169cc70c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xmvs7\" (UID: \"23203982-11fe-4867-b6b6-c196169cc70c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.957640 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23203982-11fe-4867-b6b6-c196169cc70c-serving-cert\") pod 
\"cluster-version-operator-5c965bbfc6-xmvs7\" (UID: \"23203982-11fe-4867-b6b6-c196169cc70c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.957674 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/23203982-11fe-4867-b6b6-c196169cc70c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xmvs7\" (UID: \"23203982-11fe-4867-b6b6-c196169cc70c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" Dec 05 05:53:29 crc kubenswrapper[4865]: I1205 05:53:29.957698 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23203982-11fe-4867-b6b6-c196169cc70c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xmvs7\" (UID: \"23203982-11fe-4867-b6b6-c196169cc70c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.006085 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.006108 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 05:53:30 crc kubenswrapper[4865]: E1205 05:53:30.006220 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 05:53:30 crc kubenswrapper[4865]: E1205 05:53:30.006524 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.058789 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/23203982-11fe-4867-b6b6-c196169cc70c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xmvs7\" (UID: \"23203982-11fe-4867-b6b6-c196169cc70c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.058854 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23203982-11fe-4867-b6b6-c196169cc70c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xmvs7\" (UID: \"23203982-11fe-4867-b6b6-c196169cc70c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.058896 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23203982-11fe-4867-b6b6-c196169cc70c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xmvs7\" (UID: \"23203982-11fe-4867-b6b6-c196169cc70c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.058898 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/23203982-11fe-4867-b6b6-c196169cc70c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xmvs7\" (UID: \"23203982-11fe-4867-b6b6-c196169cc70c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.058950 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/23203982-11fe-4867-b6b6-c196169cc70c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xmvs7\" (UID: \"23203982-11fe-4867-b6b6-c196169cc70c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.058977 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23203982-11fe-4867-b6b6-c196169cc70c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xmvs7\" (UID: \"23203982-11fe-4867-b6b6-c196169cc70c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.059135 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/23203982-11fe-4867-b6b6-c196169cc70c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xmvs7\" (UID: \"23203982-11fe-4867-b6b6-c196169cc70c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.059916 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23203982-11fe-4867-b6b6-c196169cc70c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xmvs7\" (UID: \"23203982-11fe-4867-b6b6-c196169cc70c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" 
Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.065238 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23203982-11fe-4867-b6b6-c196169cc70c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xmvs7\" (UID: \"23203982-11fe-4867-b6b6-c196169cc70c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.076947 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23203982-11fe-4867-b6b6-c196169cc70c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xmvs7\" (UID: \"23203982-11fe-4867-b6b6-c196169cc70c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.221253 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n9wvt" event={"ID":"1dfe264f-be2c-4470-8f1d-fef4bbcccdb2","Type":"ContainerStarted","Data":"b72b231390b5ad1ec301129a9477569289be8a3dfc762157397344a682c55e8d"} Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.221303 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n9wvt" event={"ID":"1dfe264f-be2c-4470-8f1d-fef4bbcccdb2","Type":"ContainerStarted","Data":"9e7059c816a5219d12b71892ea0abca5fb4ef1b3546ca9009f520b55f44a1741"} Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.221314 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n9wvt" event={"ID":"1dfe264f-be2c-4470-8f1d-fef4bbcccdb2","Type":"ContainerStarted","Data":"23e4184348262422f47161c80e659803e6d2653f175e8fad67bca18e94cb07ae"} Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.223753 4865 generic.go:334] "Generic (PLEG): container finished" podID="ceda1986-2884-4c10-b39b-0ab350e56ce0" containerID="3af2efbb4d7051a3cc11df9bd41495d48d583098573426e32c5c34ca186e50a8" exitCode=0 Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.223805 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmtnk" event={"ID":"ceda1986-2884-4c10-b39b-0ab350e56ce0","Type":"ContainerDied","Data":"3af2efbb4d7051a3cc11df9bd41495d48d583098573426e32c5c34ca186e50a8"} Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.242007 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n9wvt" podStartSLOduration=5.241989335 podStartE2EDuration="5.241989335s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:30.241007637 +0000 UTC m=+29.521018869" watchObservedRunningTime="2025-12-05 05:53:30.241989335 +0000 UTC m=+29.522000567" Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.251170 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.764151 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/397265e2-4c59-42f5-8774-357048ba57ac-metrics-certs\") pod \"network-metrics-daemon-j8p6s\" (UID: \"397265e2-4c59-42f5-8774-357048ba57ac\") " pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:30 crc kubenswrapper[4865]: E1205 05:53:30.764355 4865 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 05:53:30 crc kubenswrapper[4865]: E1205 05:53:30.764655 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/397265e2-4c59-42f5-8774-357048ba57ac-metrics-certs podName:397265e2-4c59-42f5-8774-357048ba57ac nodeName:}" failed. No retries permitted until 2025-12-05 05:53:32.764636745 +0000 UTC m=+32.044647967 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/397265e2-4c59-42f5-8774-357048ba57ac-metrics-certs") pod "network-metrics-daemon-j8p6s" (UID: "397265e2-4c59-42f5-8774-357048ba57ac") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 05:53:30 crc kubenswrapper[4865]: I1205 05:53:30.840126 4865 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 05 05:53:30 crc kubenswrapper[4865]: W1205 05:53:30.840470 4865 reflector.go:484] object-"openshift-cluster-version"/"default-dockercfg-gxtc4": watch of *v1.Secret ended with: very short watch: object-"openshift-cluster-version"/"default-dockercfg-gxtc4": Unexpected watch close - watch lasted less than a second and no items received Dec 05 05:53:30 crc kubenswrapper[4865]: W1205 05:53:30.841189 4865 reflector.go:484] object-"openshift-cluster-version"/"cluster-version-operator-serving-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-cluster-version"/"cluster-version-operator-serving-cert": Unexpected watch close - watch lasted less than a second and no items received Dec 05 05:53:30 crc kubenswrapper[4865]: W1205 05:53:30.841779 4865 reflector.go:484] object-"openshift-cluster-version"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-cluster-version"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 05 05:53:30 crc kubenswrapper[4865]: W1205 05:53:30.842097 4865 reflector.go:484] object-"openshift-cluster-version"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-cluster-version"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 05 05:53:31 crc kubenswrapper[4865]: I1205 05:53:31.007061 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:31 crc kubenswrapper[4865]: E1205 05:53:31.007178 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j8p6s" podUID="397265e2-4c59-42f5-8774-357048ba57ac" Dec 05 05:53:31 crc kubenswrapper[4865]: I1205 05:53:31.007556 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:53:31 crc kubenswrapper[4865]: E1205 05:53:31.007621 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 05:53:31 crc kubenswrapper[4865]: I1205 05:53:31.232732 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" event={"ID":"e740ad4f-4c03-467b-8f0f-4fec2493d426","Type":"ContainerStarted","Data":"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f"} Dec 05 05:53:31 crc kubenswrapper[4865]: I1205 05:53:31.234941 4865 generic.go:334] "Generic (PLEG): container finished" podID="ceda1986-2884-4c10-b39b-0ab350e56ce0" containerID="a25e790b265b79d22f8ac83beaa349a74ed8167edfd56092e85f4686e9ac1075" exitCode=0 Dec 05 05:53:31 crc kubenswrapper[4865]: I1205 05:53:31.234985 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmtnk" event={"ID":"ceda1986-2884-4c10-b39b-0ab350e56ce0","Type":"ContainerDied","Data":"a25e790b265b79d22f8ac83beaa349a74ed8167edfd56092e85f4686e9ac1075"} Dec 05 05:53:31 crc kubenswrapper[4865]: I1205 05:53:31.236305 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" event={"ID":"23203982-11fe-4867-b6b6-c196169cc70c","Type":"ContainerStarted","Data":"9c47b4c4e09a6e5db95196e37204b41a1aa1b33d7440d241f229b5c449543af4"} Dec 05 05:53:31 crc kubenswrapper[4865]: I1205 05:53:31.236364 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" event={"ID":"23203982-11fe-4867-b6b6-c196169cc70c","Type":"ContainerStarted","Data":"b9bbd1a8f59b73c000eae9ddd16f538692c7a36a937bebde55d3bc3322794fa5"} Dec 05 05:53:31 crc kubenswrapper[4865]: I1205 05:53:31.677240 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:31 crc kubenswrapper[4865]: E1205 05:53:31.677404 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:39.67738997 +0000 UTC m=+38.957401192 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:31 crc kubenswrapper[4865]: I1205 05:53:31.702998 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 05:53:31 crc kubenswrapper[4865]: I1205 05:53:31.922312 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 05:53:31 crc kubenswrapper[4865]: I1205 05:53:31.981457 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:31 crc kubenswrapper[4865]: I1205 05:53:31.981520 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 05:53:31 crc kubenswrapper[4865]: I1205 05:53:31.981547 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:31 crc kubenswrapper[4865]: I1205 05:53:31.981571 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:53:31 crc kubenswrapper[4865]: E1205 05:53:31.981625 4865 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 05:53:31 crc kubenswrapper[4865]: E1205 05:53:31.981687 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 05:53:31 crc kubenswrapper[4865]: E1205 05:53:31.981699 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:39.981681283 +0000 UTC m=+39.261692505 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 05:53:31 crc kubenswrapper[4865]: E1205 05:53:31.981705 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 05:53:31 crc kubenswrapper[4865]: E1205 05:53:31.981721 4865 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:31 crc kubenswrapper[4865]: E1205 05:53:31.981746 4865 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 05:53:31 crc kubenswrapper[4865]: E1205 05:53:31.981785 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:39.981744445 +0000 UTC m=+39.261755667 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:31 crc kubenswrapper[4865]: E1205 05:53:31.981752 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 05:53:31 crc kubenswrapper[4865]: E1205 05:53:31.981850 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:39.981812277 +0000 UTC m=+39.261823499 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 05:53:31 crc kubenswrapper[4865]: E1205 05:53:31.981862 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 05:53:31 crc kubenswrapper[4865]: E1205 05:53:31.981878 4865 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:31 crc kubenswrapper[4865]: E1205 05:53:31.981928 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:39.98191383 +0000 UTC m=+39.261925152 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:32 crc kubenswrapper[4865]: I1205 05:53:32.006452 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:32 crc kubenswrapper[4865]: I1205 05:53:32.006467 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 05:53:32 crc kubenswrapper[4865]: E1205 05:53:32.006612 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 05:53:32 crc kubenswrapper[4865]: E1205 05:53:32.006634 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 05:53:32 crc kubenswrapper[4865]: I1205 05:53:32.242332 4865 generic.go:334] "Generic (PLEG): container finished" podID="ceda1986-2884-4c10-b39b-0ab350e56ce0" containerID="68dd4461cbe8816fd621eae931743cd55cafdbbf92da919c5052af1ecbe143da" exitCode=0 Dec 05 05:53:32 crc kubenswrapper[4865]: I1205 05:53:32.242375 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmtnk" event={"ID":"ceda1986-2884-4c10-b39b-0ab350e56ce0","Type":"ContainerDied","Data":"68dd4461cbe8816fd621eae931743cd55cafdbbf92da919c5052af1ecbe143da"} Dec 05 05:53:32 crc kubenswrapper[4865]: I1205 05:53:32.261215 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xmvs7" podStartSLOduration=7.261199681 podStartE2EDuration="7.261199681s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:31.278333899 +0000 UTC m=+30.558345121" watchObservedRunningTime="2025-12-05 05:53:32.261199681 +0000 UTC m=+31.541210903" Dec 05 05:53:32 crc kubenswrapper[4865]: I1205 05:53:32.375832 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 05 05:53:32 crc kubenswrapper[4865]: I1205 05:53:32.411131 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 05:53:32 crc kubenswrapper[4865]: I1205 05:53:32.788491 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/397265e2-4c59-42f5-8774-357048ba57ac-metrics-certs\") pod \"network-metrics-daemon-j8p6s\" (UID: \"397265e2-4c59-42f5-8774-357048ba57ac\") " pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:32 crc kubenswrapper[4865]: E1205 05:53:32.788610 4865 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 05:53:32 crc kubenswrapper[4865]: E1205 05:53:32.788662 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/397265e2-4c59-42f5-8774-357048ba57ac-metrics-certs podName:397265e2-4c59-42f5-8774-357048ba57ac nodeName:}" failed. No retries permitted until 2025-12-05 05:53:36.788650356 +0000 UTC m=+36.068661578 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/397265e2-4c59-42f5-8774-357048ba57ac-metrics-certs") pod "network-metrics-daemon-j8p6s" (UID: "397265e2-4c59-42f5-8774-357048ba57ac") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 05:53:33 crc kubenswrapper[4865]: I1205 05:53:33.005580 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:33 crc kubenswrapper[4865]: E1205 05:53:33.005708 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j8p6s" podUID="397265e2-4c59-42f5-8774-357048ba57ac" Dec 05 05:53:33 crc kubenswrapper[4865]: I1205 05:53:33.005768 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:53:33 crc kubenswrapper[4865]: E1205 05:53:33.005840 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 05:53:34 crc kubenswrapper[4865]: I1205 05:53:34.005557 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:34 crc kubenswrapper[4865]: I1205 05:53:34.005572 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 05:53:34 crc kubenswrapper[4865]: E1205 05:53:34.006027 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 05:53:34 crc kubenswrapper[4865]: E1205 05:53:34.006098 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 05:53:34 crc kubenswrapper[4865]: I1205 05:53:34.378684 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:53:35 crc kubenswrapper[4865]: I1205 05:53:35.006135 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:35 crc kubenswrapper[4865]: I1205 05:53:35.006250 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:53:35 crc kubenswrapper[4865]: E1205 05:53:35.006410 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j8p6s" podUID="397265e2-4c59-42f5-8774-357048ba57ac" Dec 05 05:53:35 crc kubenswrapper[4865]: E1205 05:53:35.006528 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 05:53:36 crc kubenswrapper[4865]: I1205 05:53:36.006206 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 05:53:36 crc kubenswrapper[4865]: E1205 05:53:36.006721 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 05:53:36 crc kubenswrapper[4865]: I1205 05:53:36.007131 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:36 crc kubenswrapper[4865]: E1205 05:53:36.007194 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 05:53:36 crc kubenswrapper[4865]: I1205 05:53:36.276004 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmtnk" event={"ID":"ceda1986-2884-4c10-b39b-0ab350e56ce0","Type":"ContainerStarted","Data":"1ae1a7cf526b811045eb71aec1545de07ccad604486cfc1d3fc85352453f7cd8"} Dec 05 05:53:36 crc kubenswrapper[4865]: I1205 05:53:36.281673 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" event={"ID":"e740ad4f-4c03-467b-8f0f-4fec2493d426","Type":"ContainerStarted","Data":"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661"} Dec 05 05:53:36 crc kubenswrapper[4865]: I1205 05:53:36.282205 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:36 crc kubenswrapper[4865]: I1205 05:53:36.346059 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" podStartSLOduration=11.346036182 podStartE2EDuration="11.346036182s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:36.341665607 +0000 UTC m=+35.621676829" watchObservedRunningTime="2025-12-05 05:53:36.346036182 +0000 UTC m=+35.626047404" Dec 05 05:53:36 crc kubenswrapper[4865]: I1205 05:53:36.429647 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:36 crc kubenswrapper[4865]: I1205 05:53:36.598022 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:36 crc kubenswrapper[4865]: I1205 05:53:36.603526 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:36 crc kubenswrapper[4865]: I1205 
05:53:36.837225 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/397265e2-4c59-42f5-8774-357048ba57ac-metrics-certs\") pod \"network-metrics-daemon-j8p6s\" (UID: \"397265e2-4c59-42f5-8774-357048ba57ac\") " pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:36 crc kubenswrapper[4865]: E1205 05:53:36.837371 4865 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 05:53:36 crc kubenswrapper[4865]: E1205 05:53:36.837429 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/397265e2-4c59-42f5-8774-357048ba57ac-metrics-certs podName:397265e2-4c59-42f5-8774-357048ba57ac nodeName:}" failed. No retries permitted until 2025-12-05 05:53:44.837412081 +0000 UTC m=+44.117423303 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/397265e2-4c59-42f5-8774-357048ba57ac-metrics-certs") pod "network-metrics-daemon-j8p6s" (UID: "397265e2-4c59-42f5-8774-357048ba57ac") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 05:53:37 crc kubenswrapper[4865]: I1205 05:53:37.005444 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:53:37 crc kubenswrapper[4865]: I1205 05:53:37.005514 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:37 crc kubenswrapper[4865]: E1205 05:53:37.005772 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 05:53:37 crc kubenswrapper[4865]: E1205 05:53:37.005974 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j8p6s" podUID="397265e2-4c59-42f5-8774-357048ba57ac" Dec 05 05:53:37 crc kubenswrapper[4865]: I1205 05:53:37.288259 4865 generic.go:334] "Generic (PLEG): container finished" podID="ceda1986-2884-4c10-b39b-0ab350e56ce0" containerID="1ae1a7cf526b811045eb71aec1545de07ccad604486cfc1d3fc85352453f7cd8" exitCode=0 Dec 05 05:53:37 crc kubenswrapper[4865]: I1205 05:53:37.288384 4865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 05:53:37 crc kubenswrapper[4865]: I1205 05:53:37.288955 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmtnk" event={"ID":"ceda1986-2884-4c10-b39b-0ab350e56ce0","Type":"ContainerDied","Data":"1ae1a7cf526b811045eb71aec1545de07ccad604486cfc1d3fc85352453f7cd8"} Dec 05 05:53:38 crc kubenswrapper[4865]: I1205 05:53:38.005858 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:38 crc kubenswrapper[4865]: I1205 05:53:38.005899 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 05:53:38 crc kubenswrapper[4865]: E1205 05:53:38.006001 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 05:53:38 crc kubenswrapper[4865]: E1205 05:53:38.006165 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 05:53:38 crc kubenswrapper[4865]: I1205 05:53:38.240024 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j8p6s"] Dec 05 05:53:38 crc kubenswrapper[4865]: I1205 05:53:38.240139 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:38 crc kubenswrapper[4865]: E1205 05:53:38.240218 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j8p6s" podUID="397265e2-4c59-42f5-8774-357048ba57ac" Dec 05 05:53:38 crc kubenswrapper[4865]: I1205 05:53:38.294491 4865 generic.go:334] "Generic (PLEG): container finished" podID="ceda1986-2884-4c10-b39b-0ab350e56ce0" containerID="9b94c05199058383904a818aed1e602e8a430287010e546a7297cc2cf3e50638" exitCode=0 Dec 05 05:53:38 crc kubenswrapper[4865]: I1205 05:53:38.294565 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmtnk" event={"ID":"ceda1986-2884-4c10-b39b-0ab350e56ce0","Type":"ContainerDied","Data":"9b94c05199058383904a818aed1e602e8a430287010e546a7297cc2cf3e50638"} Dec 05 05:53:38 crc kubenswrapper[4865]: I1205 05:53:38.294605 4865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 05:53:39 crc kubenswrapper[4865]: I1205 05:53:39.006291 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:53:39 crc kubenswrapper[4865]: E1205 05:53:39.006797 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 05:53:39 crc kubenswrapper[4865]: I1205 05:53:39.300850 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmtnk" event={"ID":"ceda1986-2884-4c10-b39b-0ab350e56ce0","Type":"ContainerStarted","Data":"04eb9eb9e061f984bfb2d60b2a3086a862c373122f81d85c14acbb5f7baa4fa5"} Dec 05 05:53:39 crc kubenswrapper[4865]: I1205 05:53:39.326669 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kmtnk" podStartSLOduration=14.326653488 podStartE2EDuration="14.326653488s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:39.325889577 +0000 UTC m=+38.605900809" watchObservedRunningTime="2025-12-05 05:53:39.326653488 +0000 UTC m=+38.606664710" Dec 05 05:53:39 crc kubenswrapper[4865]: I1205 05:53:39.371171 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:53:39 crc kubenswrapper[4865]: I1205 05:53:39.767024 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:39 crc kubenswrapper[4865]: E1205 05:53:39.767295 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:55.767260822 +0000 UTC m=+55.047272044 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.005882 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:40 crc kubenswrapper[4865]: E1205 05:53:40.005996 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.006330 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.006427 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:40 crc kubenswrapper[4865]: E1205 05:53:40.006586 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 05:53:40 crc kubenswrapper[4865]: E1205 05:53:40.006690 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j8p6s" podUID="397265e2-4c59-42f5-8774-357048ba57ac" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.070301 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.070350 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.070381 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.070407 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:53:40 crc kubenswrapper[4865]: E1205 05:53:40.070435 4865 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 05:53:40 crc kubenswrapper[4865]: E1205 05:53:40.070491 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:56.070476454 +0000 UTC m=+55.350487676 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 05:53:40 crc kubenswrapper[4865]: E1205 05:53:40.070520 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 05:53:40 crc kubenswrapper[4865]: E1205 05:53:40.070559 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 05:53:40 crc kubenswrapper[4865]: E1205 05:53:40.070572 4865 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:40 crc kubenswrapper[4865]: E1205 05:53:40.070613 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:56.070602787 +0000 UTC m=+55.350614009 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:40 crc kubenswrapper[4865]: E1205 05:53:40.070625 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 05:53:40 crc kubenswrapper[4865]: E1205 05:53:40.070673 4865 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 05:53:40 crc kubenswrapper[4865]: E1205 05:53:40.070692 4865 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:40 crc kubenswrapper[4865]: E1205 05:53:40.070644 4865 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 05:53:40 crc kubenswrapper[4865]: E1205 05:53:40.070765 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:56.070741951 +0000 UTC m=+55.350753203 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 05:53:40 crc kubenswrapper[4865]: E1205 05:53:40.070793 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 05:53:56.070782762 +0000 UTC m=+55.350793984 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.818185 4865 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.818426 4865 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.857224 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pv5k9"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.857857 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.857947 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hrb2j"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.858766 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.861491 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.861698 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.862180 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.862559 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.862711 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.862713 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.862881 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.863746 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-82xnx"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.864479 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.864501 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.864991 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.883078 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.885527 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.885543 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.885710 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.888160 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.888325 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.888481 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.888566 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.888654 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.888728 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.888328 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.889793 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.889938 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.890041 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.890140 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.890277 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.890500 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.890635 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.888729 4865 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.891062 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5vtbs"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.891506 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5vtbs" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.892566 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z77ns"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.892847 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z77ns" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.894172 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.894619 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.894952 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.895622 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.895745 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.896242 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-rp2kv"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.896781 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.902465 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.902947 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.904435 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.905526 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbmfc"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.905586 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.907700 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rp2kv" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.919518 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-95g8c"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.921452 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.921571 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.922229 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbmfc" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.922805 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xhxgz"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.923105 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.923496 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-95g8c" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.928222 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.928269 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.928345 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.928526 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.928543 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.928633 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.928695 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.928722 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.928795 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.928853 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.929857 4865 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.934086 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.934271 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.938093 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.942509 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.942665 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.942998 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.945381 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.945545 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-v2jhh"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.945606 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.946556 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.947983 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.951295 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-d6z25"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.951726 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v5wv7"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.952035 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.952414 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-h6t2z"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.952800 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.953207 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-v2jhh" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.953460 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-d6z25" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.953763 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v5wv7" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.954103 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.957221 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.959243 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.962289 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-lclsv"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.962691 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vqhhk"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.979197 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6fth8"] Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.980264 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.981002 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vqhhk" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.982396 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.983389 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.983637 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.983849 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.984164 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.984231 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.984281 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 05 05:53:40 crc kubenswrapper[4865]: I1205 05:53:40.984434 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.006132 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.006321 4865 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.006437 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.006546 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.006760 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007149 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007179 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/320fa4b0-5e00-4bca-b8f2-1afa7387d156-encryption-config\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007200 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch7jn\" (UniqueName: \"kubernetes.io/projected/dabab9ed-be46-4cbf-a0e5-8e3679b3b434-kube-api-access-ch7jn\") pod \"openshift-apiserver-operator-796bbdcf4f-z77ns\" (UID: \"dabab9ed-be46-4cbf-a0e5-8e3679b3b434\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z77ns" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007218 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-audit-policies\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007233 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60c9fd82-77d0-4c86-9ff6-0489fbeab324-config\") pod \"console-operator-58897d9998-95g8c\" (UID: \"60c9fd82-77d0-4c86-9ff6-0489fbeab324\") " pod="openshift-console-operator/console-operator-58897d9998-95g8c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007248 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/320fa4b0-5e00-4bca-b8f2-1afa7387d156-audit\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007263 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007277 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/320fa4b0-5e00-4bca-b8f2-1afa7387d156-node-pullsecrets\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007295 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/320fa4b0-5e00-4bca-b8f2-1afa7387d156-config\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007314 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m84hq\" (UniqueName: \"kubernetes.io/projected/320fa4b0-5e00-4bca-b8f2-1afa7387d156-kube-api-access-m84hq\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007334 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nssgl\" (UniqueName: \"kubernetes.io/projected/d3782972-9e30-42ae-9af7-9cba4bdcbfe3-kube-api-access-nssgl\") pod \"cluster-samples-operator-665b6dd947-sbmfc\" (UID: \"d3782972-9e30-42ae-9af7-9cba4bdcbfe3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbmfc" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007410 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ff00812-1a0c-4bbc-8222-d7765505af6b-config\") pod \"route-controller-manager-6576b87f9c-k5gw9\" (UID: \"4ff00812-1a0c-4bbc-8222-d7765505af6b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007433 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2xgw\" (UniqueName: \"kubernetes.io/projected/bae52c2b-77f5-4196-804b-e72254ce6f24-kube-api-access-k2xgw\") pod \"openshift-controller-manager-operator-756b6f6bc6-5vtbs\" (UID: \"bae52c2b-77f5-4196-804b-e72254ce6f24\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5vtbs" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007455 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qbxk\" (UniqueName: \"kubernetes.io/projected/4ff00812-1a0c-4bbc-8222-d7765505af6b-kube-api-access-9qbxk\") pod \"route-controller-manager-6576b87f9c-k5gw9\" (UID: \"4ff00812-1a0c-4bbc-8222-d7765505af6b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007478 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/320fa4b0-5e00-4bca-b8f2-1afa7387d156-serving-cert\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007500 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007545 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl5np\" (UniqueName: \"kubernetes.io/projected/d02c7cde-9b06-4366-9ad0-b46d403446b1-kube-api-access-zl5np\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007563 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60c9fd82-77d0-4c86-9ff6-0489fbeab324-serving-cert\") pod \"console-operator-58897d9998-95g8c\" (UID: \"60c9fd82-77d0-4c86-9ff6-0489fbeab324\") " pod="openshift-console-operator/console-operator-58897d9998-95g8c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007583 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ff00812-1a0c-4bbc-8222-d7765505af6b-client-ca\") pod \"route-controller-manager-6576b87f9c-k5gw9\" (UID: \"4ff00812-1a0c-4bbc-8222-d7765505af6b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007608 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f6c0508-ee1a-4224-bdc6-f6f7af78d75a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xhxgz\" (UID: \"5f6c0508-ee1a-4224-bdc6-f6f7af78d75a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007632 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d02c7cde-9b06-4366-9ad0-b46d403446b1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007655 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d02c7cde-9b06-4366-9ad0-b46d403446b1-serving-cert\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007676 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bae52c2b-77f5-4196-804b-e72254ce6f24-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5vtbs\" (UID: \"bae52c2b-77f5-4196-804b-e72254ce6f24\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5vtbs" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007698 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5pnt\" (UniqueName: \"kubernetes.io/projected/445165fe-92bf-4fb9-9f6e-4f8101278621-kube-api-access-r5pnt\") pod \"machine-approver-56656f9798-rp2kv\" (UID: \"445165fe-92bf-4fb9-9f6e-4f8101278621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rp2kv" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007728 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007753 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dabab9ed-be46-4cbf-a0e5-8e3679b3b434-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-z77ns\" (UID: \"dabab9ed-be46-4cbf-a0e5-8e3679b3b434\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z77ns" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007788 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnswt\" (UniqueName: \"kubernetes.io/projected/5f6c0508-ee1a-4224-bdc6-f6f7af78d75a-kube-api-access-tnswt\") pod \"authentication-operator-69f744f599-xhxgz\" (UID: \"5f6c0508-ee1a-4224-bdc6-f6f7af78d75a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007906 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d02c7cde-9b06-4366-9ad0-b46d403446b1-etcd-client\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007924 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.007932 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/445165fe-92bf-4fb9-9f6e-4f8101278621-machine-approver-tls\") pod \"machine-approver-56656f9798-rp2kv\" (UID: \"445165fe-92bf-4fb9-9f6e-4f8101278621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rp2kv" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008179 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d02c7cde-9b06-4366-9ad0-b46d403446b1-audit-policies\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008200 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ff00812-1a0c-4bbc-8222-d7765505af6b-serving-cert\") pod \"route-controller-manager-6576b87f9c-k5gw9\" (UID: \"4ff00812-1a0c-4bbc-8222-d7765505af6b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008222 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/320fa4b0-5e00-4bca-b8f2-1afa7387d156-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008237 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhl28\" (UniqueName: \"kubernetes.io/projected/8cc2c35c-7bb7-4475-a318-0133139b9359-kube-api-access-hhl28\") pod \"controller-manager-879f6c89f-hrb2j\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008260 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dxzfq\" (UID: \"4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008281 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dabab9ed-be46-4cbf-a0e5-8e3679b3b434-config\") pod \"openshift-apiserver-operator-796bbdcf4f-z77ns\" (UID: \"dabab9ed-be46-4cbf-a0e5-8e3679b3b434\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z77ns" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008296 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cc2c35c-7bb7-4475-a318-0133139b9359-config\") pod \"controller-manager-879f6c89f-hrb2j\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008315 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/320fa4b0-5e00-4bca-b8f2-1afa7387d156-audit-dir\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008330 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60c9fd82-77d0-4c86-9ff6-0489fbeab324-trusted-ca\") pod \"console-operator-58897d9998-95g8c\" (UID: \"60c9fd82-77d0-4c86-9ff6-0489fbeab324\") " pod="openshift-console-operator/console-operator-58897d9998-95g8c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008346 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f6c0508-ee1a-4224-bdc6-f6f7af78d75a-serving-cert\") pod \"authentication-operator-69f744f599-xhxgz\" (UID: \"5f6c0508-ee1a-4224-bdc6-f6f7af78d75a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008368 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008389 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008524 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008543 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d02c7cde-9b06-4366-9ad0-b46d403446b1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008558 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmqmx\" (UniqueName: \"kubernetes.io/projected/60c9fd82-77d0-4c86-9ff6-0489fbeab324-kube-api-access-wmqmx\") pod \"console-operator-58897d9998-95g8c\" (UID: 
\"60c9fd82-77d0-4c86-9ff6-0489fbeab324\") " pod="openshift-console-operator/console-operator-58897d9998-95g8c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008579 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008597 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/445165fe-92bf-4fb9-9f6e-4f8101278621-config\") pod \"machine-approver-56656f9798-rp2kv\" (UID: \"445165fe-92bf-4fb9-9f6e-4f8101278621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rp2kv" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008611 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3782972-9e30-42ae-9af7-9cba4bdcbfe3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-sbmfc\" (UID: \"d3782972-9e30-42ae-9af7-9cba4bdcbfe3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbmfc" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008625 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008640 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szwj6\" (UniqueName: \"kubernetes.io/projected/87a82cae-057c-47d5-9703-eb48128e1bd9-kube-api-access-szwj6\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008657 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bae52c2b-77f5-4196-804b-e72254ce6f24-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5vtbs\" (UID: \"bae52c2b-77f5-4196-804b-e72254ce6f24\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5vtbs" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008673 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56wl8\" (UniqueName: \"kubernetes.io/projected/4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00-kube-api-access-56wl8\") pod \"openshift-config-operator-7777fb866f-dxzfq\" (UID: \"4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008716 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/445165fe-92bf-4fb9-9f6e-4f8101278621-auth-proxy-config\") pod \"machine-approver-56656f9798-rp2kv\" (UID: \"445165fe-92bf-4fb9-9f6e-4f8101278621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rp2kv" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008744 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/320fa4b0-5e00-4bca-b8f2-1afa7387d156-etcd-client\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008758 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f6c0508-ee1a-4224-bdc6-f6f7af78d75a-config\") pod \"authentication-operator-69f744f599-xhxgz\" (UID: \"5f6c0508-ee1a-4224-bdc6-f6f7af78d75a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008778 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cc2c35c-7bb7-4475-a318-0133139b9359-client-ca\") pod \"controller-manager-879f6c89f-hrb2j\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008795 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cc2c35c-7bb7-4475-a318-0133139b9359-serving-cert\") pod \"controller-manager-879f6c89f-hrb2j\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008814 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d02c7cde-9b06-4366-9ad0-b46d403446b1-audit-dir\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008843 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f6c0508-ee1a-4224-bdc6-f6f7af78d75a-service-ca-bundle\") pod \"authentication-operator-69f744f599-xhxgz\" (UID: \"5f6c0508-ee1a-4224-bdc6-f6f7af78d75a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008859 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008878 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/87a82cae-057c-47d5-9703-eb48128e1bd9-audit-dir\") 
pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008899 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00-serving-cert\") pod \"openshift-config-operator-7777fb866f-dxzfq\" (UID: \"4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008916 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/320fa4b0-5e00-4bca-b8f2-1afa7387d156-etcd-serving-ca\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008931 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/320fa4b0-5e00-4bca-b8f2-1afa7387d156-image-import-ca\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008964 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008980 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d02c7cde-9b06-4366-9ad0-b46d403446b1-encryption-config\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.008994 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cc2c35c-7bb7-4475-a318-0133139b9359-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hrb2j\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.009232 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.009411 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.009557 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.010164 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 
05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.010311 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.010431 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.010544 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.010670 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.010804 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.010946 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.011381 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.011399 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.011538 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.011579 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.011635 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.011657 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.011731 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.012045 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.012319 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.012689 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.012966 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.013115 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.013215 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.013310 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.013409 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.013507 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.013601 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.013728 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.014181 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.014650 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.014758 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.014865 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.014912 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.015440 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.016205 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.016325 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.020348 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qvs5j"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.020860 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-cjczz"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.023463 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.025282 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zjllp"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.025783 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qvs5j" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.026452 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.029174 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kxq8c"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.029458 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-l4sds"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.029889 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.030290 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.030373 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.030296 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hrb2j"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.030461 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.030604 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zjllp" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.030754 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kxq8c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.030796 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l4sds" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.031251 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.031529 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.031923 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.032113 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.032326 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.034544 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.037943 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-drrtb"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.038917 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-drrtb" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.047128 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-cl4kt"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.047791 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.049152 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-cl4kt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.060059 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.064065 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.064843 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.066255 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8dtw"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.067171 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8dtw" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.068999 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fmdqp"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.069789 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fmdqp" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.077996 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whlbb"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.078936 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8thws"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.079323 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.079406 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whlbb" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.080975 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.081871 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pv5k9"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.083240 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hgrgd"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.083683 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hgrgd" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.084003 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbmfc"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.084628 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.085576 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xhxgz"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.086319 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-82xnx"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.087284 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z77ns"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.088115 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v5wv7"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.089079 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.092170 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5vtbs"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.093119 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vqhhk"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.099163 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-v2jhh"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.099225 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6fth8"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.099237 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.101675 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.106395 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-95g8c"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.109693 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.109862 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/320fa4b0-5e00-4bca-b8f2-1afa7387d156-encryption-config\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.109974 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch7jn\" (UniqueName: \"kubernetes.io/projected/dabab9ed-be46-4cbf-a0e5-8e3679b3b434-kube-api-access-ch7jn\") pod \"openshift-apiserver-operator-796bbdcf4f-z77ns\" (UID: \"dabab9ed-be46-4cbf-a0e5-8e3679b3b434\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z77ns" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.110082 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-audit-policies\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.110172 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60c9fd82-77d0-4c86-9ff6-0489fbeab324-config\") pod \"console-operator-58897d9998-95g8c\" (UID: \"60c9fd82-77d0-4c86-9ff6-0489fbeab324\") " pod="openshift-console-operator/console-operator-58897d9998-95g8c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.110246 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/320fa4b0-5e00-4bca-b8f2-1afa7387d156-audit\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.110324 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.110413 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/320fa4b0-5e00-4bca-b8f2-1afa7387d156-node-pullsecrets\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.110501 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/320fa4b0-5e00-4bca-b8f2-1afa7387d156-config\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.111485 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m84hq\" (UniqueName: \"kubernetes.io/projected/320fa4b0-5e00-4bca-b8f2-1afa7387d156-kube-api-access-m84hq\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 
05:53:41.111573 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nssgl\" (UniqueName: \"kubernetes.io/projected/d3782972-9e30-42ae-9af7-9cba4bdcbfe3-kube-api-access-nssgl\") pod \"cluster-samples-operator-665b6dd947-sbmfc\" (UID: \"d3782972-9e30-42ae-9af7-9cba4bdcbfe3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbmfc" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.111675 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ff00812-1a0c-4bbc-8222-d7765505af6b-config\") pod \"route-controller-manager-6576b87f9c-k5gw9\" (UID: \"4ff00812-1a0c-4bbc-8222-d7765505af6b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.111746 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2xgw\" (UniqueName: \"kubernetes.io/projected/bae52c2b-77f5-4196-804b-e72254ce6f24-kube-api-access-k2xgw\") pod \"openshift-controller-manager-operator-756b6f6bc6-5vtbs\" (UID: \"bae52c2b-77f5-4196-804b-e72254ce6f24\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5vtbs" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.111841 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dc34f331-e440-4ee3-98db-070d3115e2fd-etcd-service-ca\") pod \"etcd-operator-b45778765-cjczz\" (UID: \"dc34f331-e440-4ee3-98db-070d3115e2fd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.111945 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/320fa4b0-5e00-4bca-b8f2-1afa7387d156-serving-cert\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.112043 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4frk\" (UniqueName: \"kubernetes.io/projected/26c8d7ff-d655-4d40-b99c-e11daec5b263-kube-api-access-m4frk\") pod \"cluster-image-registry-operator-dc59b4c8b-6s5ts\" (UID: \"26c8d7ff-d655-4d40-b99c-e11daec5b263\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.112115 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qbxk\" (UniqueName: \"kubernetes.io/projected/4ff00812-1a0c-4bbc-8222-d7765505af6b-kube-api-access-9qbxk\") pod \"route-controller-manager-6576b87f9c-k5gw9\" (UID: \"4ff00812-1a0c-4bbc-8222-d7765505af6b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.112183 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 
05:53:41.112256 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl5np\" (UniqueName: \"kubernetes.io/projected/d02c7cde-9b06-4366-9ad0-b46d403446b1-kube-api-access-zl5np\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.112327 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60c9fd82-77d0-4c86-9ff6-0489fbeab324-serving-cert\") pod \"console-operator-58897d9998-95g8c\" (UID: \"60c9fd82-77d0-4c86-9ff6-0489fbeab324\") " pod="openshift-console-operator/console-operator-58897d9998-95g8c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.112394 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ff00812-1a0c-4bbc-8222-d7765505af6b-client-ca\") pod \"route-controller-manager-6576b87f9c-k5gw9\" (UID: \"4ff00812-1a0c-4bbc-8222-d7765505af6b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.112459 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f6c0508-ee1a-4224-bdc6-f6f7af78d75a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xhxgz\" (UID: \"5f6c0508-ee1a-4224-bdc6-f6f7af78d75a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.111027 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.112588 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/320fa4b0-5e00-4bca-b8f2-1afa7387d156-config\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.112648 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60c9fd82-77d0-4c86-9ff6-0489fbeab324-config\") pod \"console-operator-58897d9998-95g8c\" (UID: \"60c9fd82-77d0-4c86-9ff6-0489fbeab324\") " pod="openshift-console-operator/console-operator-58897d9998-95g8c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.112713 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d02c7cde-9b06-4366-9ad0-b46d403446b1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.112783 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc34f331-e440-4ee3-98db-070d3115e2fd-config\") pod 
\"etcd-operator-b45778765-cjczz\" (UID: \"dc34f331-e440-4ee3-98db-070d3115e2fd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.112886 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc594832-ffe0-4764-95ac-f62739f0314e-config\") pod \"kube-controller-manager-operator-78b949d7b-v5wv7\" (UID: \"cc594832-ffe0-4764-95ac-f62739f0314e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v5wv7" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.113003 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d02c7cde-9b06-4366-9ad0-b46d403446b1-serving-cert\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.113083 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc34f331-e440-4ee3-98db-070d3115e2fd-serving-cert\") pod \"etcd-operator-b45778765-cjczz\" (UID: \"dc34f331-e440-4ee3-98db-070d3115e2fd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.113161 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bae52c2b-77f5-4196-804b-e72254ce6f24-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5vtbs\" (UID: \"bae52c2b-77f5-4196-804b-e72254ce6f24\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5vtbs" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.113232 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26c8d7ff-d655-4d40-b99c-e11daec5b263-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6s5ts\" (UID: \"26c8d7ff-d655-4d40-b99c-e11daec5b263\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.113296 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2l69\" (UniqueName: \"kubernetes.io/projected/d5a2cf4e-5936-433e-ae66-9906820ebe95-kube-api-access-b2l69\") pod \"catalog-operator-68c6474976-bg82r\" (UID: \"d5a2cf4e-5936-433e-ae66-9906820ebe95\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.113372 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5pnt\" (UniqueName: \"kubernetes.io/projected/445165fe-92bf-4fb9-9f6e-4f8101278621-kube-api-access-r5pnt\") pod \"machine-approver-56656f9798-rp2kv\" (UID: \"445165fe-92bf-4fb9-9f6e-4f8101278621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rp2kv" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.113423 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/320fa4b0-5e00-4bca-b8f2-1afa7387d156-audit\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " 
pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.113441 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.113562 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/26c8d7ff-d655-4d40-b99c-e11daec5b263-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6s5ts\" (UID: \"26c8d7ff-d655-4d40-b99c-e11daec5b263\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.113637 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dabab9ed-be46-4cbf-a0e5-8e3679b3b434-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-z77ns\" (UID: \"dabab9ed-be46-4cbf-a0e5-8e3679b3b434\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z77ns" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.114770 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ff00812-1a0c-4bbc-8222-d7765505af6b-client-ca\") pod \"route-controller-manager-6576b87f9c-k5gw9\" (UID: \"4ff00812-1a0c-4bbc-8222-d7765505af6b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.111389 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lclsv"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.117412 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-cjczz"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.117503 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.117584 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7lcjb"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.115911 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ff00812-1a0c-4bbc-8222-d7765505af6b-config\") pod \"route-controller-manager-6576b87f9c-k5gw9\" (UID: \"4ff00812-1a0c-4bbc-8222-d7765505af6b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.118962 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f6c0508-ee1a-4224-bdc6-f6f7af78d75a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xhxgz\" (UID: \"5f6c0508-ee1a-4224-bdc6-f6f7af78d75a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.112055 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/320fa4b0-5e00-4bca-b8f2-1afa7387d156-node-pullsecrets\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.119867 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/320fa4b0-5e00-4bca-b8f2-1afa7387d156-serving-cert\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.115292 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-audit-policies\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.125212 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ps4tl"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.126169 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7lcjb" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.126559 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ps4tl" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.127106 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnswt\" (UniqueName: \"kubernetes.io/projected/5f6c0508-ee1a-4224-bdc6-f6f7af78d75a-kube-api-access-tnswt\") pod \"authentication-operator-69f744f599-xhxgz\" (UID: \"5f6c0508-ee1a-4224-bdc6-f6f7af78d75a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.127228 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d02c7cde-9b06-4366-9ad0-b46d403446b1-etcd-client\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.127361 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/445165fe-92bf-4fb9-9f6e-4f8101278621-machine-approver-tls\") pod \"machine-approver-56656f9798-rp2kv\" (UID: \"445165fe-92bf-4fb9-9f6e-4f8101278621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rp2kv" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.127455 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d02c7cde-9b06-4366-9ad0-b46d403446b1-audit-policies\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.127552 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ff00812-1a0c-4bbc-8222-d7765505af6b-serving-cert\") pod \"route-controller-manager-6576b87f9c-k5gw9\" (UID: \"4ff00812-1a0c-4bbc-8222-d7765505af6b\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.127688 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dxzfq\" (UID: \"4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.127898 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.128048 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/320fa4b0-5e00-4bca-b8f2-1afa7387d156-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.128145 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhl28\" (UniqueName: \"kubernetes.io/projected/8cc2c35c-7bb7-4475-a318-0133139b9359-kube-api-access-hhl28\") pod \"controller-manager-879f6c89f-hrb2j\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.128214 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cc2c35c-7bb7-4475-a318-0133139b9359-config\") pod \"controller-manager-879f6c89f-hrb2j\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.128286 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/26c8d7ff-d655-4d40-b99c-e11daec5b263-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6s5ts\" (UID: \"26c8d7ff-d655-4d40-b99c-e11daec5b263\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.128377 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f657509-34dd-4ea9-84fb-ce4548fde2f3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-w28tr\" (UID: \"3f657509-34dd-4ea9-84fb-ce4548fde2f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.128451 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dabab9ed-be46-4cbf-a0e5-8e3679b3b434-config\") pod \"openshift-apiserver-operator-796bbdcf4f-z77ns\" (UID: \"dabab9ed-be46-4cbf-a0e5-8e3679b3b434\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z77ns" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.128604 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/320fa4b0-5e00-4bca-b8f2-1afa7387d156-audit-dir\") pod 
\"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.128692 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f657509-34dd-4ea9-84fb-ce4548fde2f3-metrics-tls\") pod \"ingress-operator-5b745b69d9-w28tr\" (UID: \"3f657509-34dd-4ea9-84fb-ce4548fde2f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.128799 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f6c0508-ee1a-4224-bdc6-f6f7af78d75a-serving-cert\") pod \"authentication-operator-69f744f599-xhxgz\" (UID: \"5f6c0508-ee1a-4224-bdc6-f6f7af78d75a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.128892 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60c9fd82-77d0-4c86-9ff6-0489fbeab324-trusted-ca\") pod \"console-operator-58897d9998-95g8c\" (UID: \"60c9fd82-77d0-4c86-9ff6-0489fbeab324\") " pod="openshift-console-operator/console-operator-58897d9998-95g8c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.128984 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.129058 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f657509-34dd-4ea9-84fb-ce4548fde2f3-trusted-ca\") pod \"ingress-operator-5b745b69d9-w28tr\" (UID: \"3f657509-34dd-4ea9-84fb-ce4548fde2f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.130277 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dabab9ed-be46-4cbf-a0e5-8e3679b3b434-config\") pod \"openshift-apiserver-operator-796bbdcf4f-z77ns\" (UID: \"dabab9ed-be46-4cbf-a0e5-8e3679b3b434\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z77ns" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.130318 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dxzfq\" (UID: \"4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.131749 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60c9fd82-77d0-4c86-9ff6-0489fbeab324-trusted-ca\") pod \"console-operator-58897d9998-95g8c\" (UID: \"60c9fd82-77d0-4c86-9ff6-0489fbeab324\") " pod="openshift-console-operator/console-operator-58897d9998-95g8c" Dec 
05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.131787 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/320fa4b0-5e00-4bca-b8f2-1afa7387d156-audit-dir\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.132614 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/320fa4b0-5e00-4bca-b8f2-1afa7387d156-encryption-config\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.133479 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.134099 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-l4sds"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.134226 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zjllp"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.134232 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.134811 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.134879 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.134903 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d02c7cde-9b06-4366-9ad0-b46d403446b1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.134926 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmqmx\" (UniqueName: 
\"kubernetes.io/projected/60c9fd82-77d0-4c86-9ff6-0489fbeab324-kube-api-access-wmqmx\") pod \"console-operator-58897d9998-95g8c\" (UID: \"60c9fd82-77d0-4c86-9ff6-0489fbeab324\") " pod="openshift-console-operator/console-operator-58897d9998-95g8c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.134950 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dc34f331-e440-4ee3-98db-070d3115e2fd-etcd-ca\") pod \"etcd-operator-b45778765-cjczz\" (UID: \"dc34f331-e440-4ee3-98db-070d3115e2fd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.134977 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxbfz\" (UniqueName: \"kubernetes.io/projected/3f657509-34dd-4ea9-84fb-ce4548fde2f3-kube-api-access-dxbfz\") pod \"ingress-operator-5b745b69d9-w28tr\" (UID: \"3f657509-34dd-4ea9-84fb-ce4548fde2f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.135005 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.135023 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrjwg\" (UniqueName: \"kubernetes.io/projected/dc34f331-e440-4ee3-98db-070d3115e2fd-kube-api-access-xrjwg\") pod \"etcd-operator-b45778765-cjczz\" (UID: \"dc34f331-e440-4ee3-98db-070d3115e2fd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.135046 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/445165fe-92bf-4fb9-9f6e-4f8101278621-config\") pod \"machine-approver-56656f9798-rp2kv\" (UID: \"445165fe-92bf-4fb9-9f6e-4f8101278621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rp2kv" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.135069 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3782972-9e30-42ae-9af7-9cba4bdcbfe3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-sbmfc\" (UID: \"d3782972-9e30-42ae-9af7-9cba4bdcbfe3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbmfc" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.135188 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.135217 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szwj6\" (UniqueName: \"kubernetes.io/projected/87a82cae-057c-47d5-9703-eb48128e1bd9-kube-api-access-szwj6\") pod 
\"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.135235 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bae52c2b-77f5-4196-804b-e72254ce6f24-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5vtbs\" (UID: \"bae52c2b-77f5-4196-804b-e72254ce6f24\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5vtbs" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.135258 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56wl8\" (UniqueName: \"kubernetes.io/projected/4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00-kube-api-access-56wl8\") pod \"openshift-config-operator-7777fb866f-dxzfq\" (UID: \"4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.135280 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d5a2cf4e-5936-433e-ae66-9906820ebe95-profile-collector-cert\") pod \"catalog-operator-68c6474976-bg82r\" (UID: \"d5a2cf4e-5936-433e-ae66-9906820ebe95\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.135320 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/445165fe-92bf-4fb9-9f6e-4f8101278621-auth-proxy-config\") pod \"machine-approver-56656f9798-rp2kv\" (UID: \"445165fe-92bf-4fb9-9f6e-4f8101278621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rp2kv" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.135339 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/320fa4b0-5e00-4bca-b8f2-1afa7387d156-etcd-client\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.135359 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f6c0508-ee1a-4224-bdc6-f6f7af78d75a-config\") pod \"authentication-operator-69f744f599-xhxgz\" (UID: \"5f6c0508-ee1a-4224-bdc6-f6f7af78d75a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.135382 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc594832-ffe0-4764-95ac-f62739f0314e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-v5wv7\" (UID: \"cc594832-ffe0-4764-95ac-f62739f0314e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v5wv7" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.135734 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f6c0508-ee1a-4224-bdc6-f6f7af78d75a-serving-cert\") pod \"authentication-operator-69f744f599-xhxgz\" (UID: 
\"5f6c0508-ee1a-4224-bdc6-f6f7af78d75a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.136549 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bae52c2b-77f5-4196-804b-e72254ce6f24-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5vtbs\" (UID: \"bae52c2b-77f5-4196-804b-e72254ce6f24\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5vtbs" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.137697 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d02c7cde-9b06-4366-9ad0-b46d403446b1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.138776 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d02c7cde-9b06-4366-9ad0-b46d403446b1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.139382 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dabab9ed-be46-4cbf-a0e5-8e3679b3b434-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-z77ns\" (UID: \"dabab9ed-be46-4cbf-a0e5-8e3679b3b434\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z77ns" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.140001 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/445165fe-92bf-4fb9-9f6e-4f8101278621-config\") pod \"machine-approver-56656f9798-rp2kv\" (UID: \"445165fe-92bf-4fb9-9f6e-4f8101278621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rp2kv" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.141184 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.141519 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cc2c35c-7bb7-4475-a318-0133139b9359-client-ca\") pod \"controller-manager-879f6c89f-hrb2j\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.141603 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cc2c35c-7bb7-4475-a318-0133139b9359-serving-cert\") pod \"controller-manager-879f6c89f-hrb2j\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.141639 4865 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d5a2cf4e-5936-433e-ae66-9906820ebe95-srv-cert\") pod \"catalog-operator-68c6474976-bg82r\" (UID: \"d5a2cf4e-5936-433e-ae66-9906820ebe95\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.141686 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60c9fd82-77d0-4c86-9ff6-0489fbeab324-serving-cert\") pod \"console-operator-58897d9998-95g8c\" (UID: \"60c9fd82-77d0-4c86-9ff6-0489fbeab324\") " pod="openshift-console-operator/console-operator-58897d9998-95g8c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.141909 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f6c0508-ee1a-4224-bdc6-f6f7af78d75a-config\") pod \"authentication-operator-69f744f599-xhxgz\" (UID: \"5f6c0508-ee1a-4224-bdc6-f6f7af78d75a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.142003 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/445165fe-92bf-4fb9-9f6e-4f8101278621-auth-proxy-config\") pod \"machine-approver-56656f9798-rp2kv\" (UID: \"445165fe-92bf-4fb9-9f6e-4f8101278621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rp2kv" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.142092 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d02c7cde-9b06-4366-9ad0-b46d403446b1-etcd-client\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.142611 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.142628 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cc2c35c-7bb7-4475-a318-0133139b9359-client-ca\") pod \"controller-manager-879f6c89f-hrb2j\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.143383 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3782972-9e30-42ae-9af7-9cba4bdcbfe3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-sbmfc\" (UID: \"d3782972-9e30-42ae-9af7-9cba4bdcbfe3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbmfc" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.143428 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/320fa4b0-5e00-4bca-b8f2-1afa7387d156-etcd-client\") pod \"apiserver-76f77b778f-pv5k9\" (UID: 
\"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.143466 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d02c7cde-9b06-4366-9ad0-b46d403446b1-audit-dir\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.143498 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f6c0508-ee1a-4224-bdc6-f6f7af78d75a-service-ca-bundle\") pod \"authentication-operator-69f744f599-xhxgz\" (UID: \"5f6c0508-ee1a-4224-bdc6-f6f7af78d75a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.143526 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc594832-ffe0-4764-95ac-f62739f0314e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-v5wv7\" (UID: \"cc594832-ffe0-4764-95ac-f62739f0314e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v5wv7" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.143573 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d02c7cde-9b06-4366-9ad0-b46d403446b1-audit-dir\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.143725 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.143768 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/320fa4b0-5e00-4bca-b8f2-1afa7387d156-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.144118 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.144334 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.144403 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f6c0508-ee1a-4224-bdc6-f6f7af78d75a-service-ca-bundle\") pod \"authentication-operator-69f744f599-xhxgz\" (UID: 
\"5f6c0508-ee1a-4224-bdc6-f6f7af78d75a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.144630 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/87a82cae-057c-47d5-9703-eb48128e1bd9-audit-dir\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.144687 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00-serving-cert\") pod \"openshift-config-operator-7777fb866f-dxzfq\" (UID: \"4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.144719 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/87a82cae-057c-47d5-9703-eb48128e1bd9-audit-dir\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.144877 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/320fa4b0-5e00-4bca-b8f2-1afa7387d156-image-import-ca\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.144928 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dc34f331-e440-4ee3-98db-070d3115e2fd-etcd-client\") pod \"etcd-operator-b45778765-cjczz\" (UID: \"dc34f331-e440-4ee3-98db-070d3115e2fd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.145225 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d02c7cde-9b06-4366-9ad0-b46d403446b1-audit-policies\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.145564 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/320fa4b0-5e00-4bca-b8f2-1afa7387d156-etcd-serving-ca\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.145709 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.145748 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" 
(UniqueName: \"kubernetes.io/secret/d02c7cde-9b06-4366-9ad0-b46d403446b1-encryption-config\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.145928 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-d6z25"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.146201 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/320fa4b0-5e00-4bca-b8f2-1afa7387d156-image-import-ca\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.146549 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/320fa4b0-5e00-4bca-b8f2-1afa7387d156-etcd-serving-ca\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.146723 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cc2c35c-7bb7-4475-a318-0133139b9359-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hrb2j\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.148051 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cc2c35c-7bb7-4475-a318-0133139b9359-config\") pod \"controller-manager-879f6c89f-hrb2j\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.148443 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cc2c35c-7bb7-4475-a318-0133139b9359-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hrb2j\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.149139 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.149808 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.149818 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bae52c2b-77f5-4196-804b-e72254ce6f24-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5vtbs\" (UID: \"bae52c2b-77f5-4196-804b-e72254ce6f24\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5vtbs" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 
05:53:41.150330 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d02c7cde-9b06-4366-9ad0-b46d403446b1-serving-cert\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.151192 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.152046 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ff00812-1a0c-4bbc-8222-d7765505af6b-serving-cert\") pod \"route-controller-manager-6576b87f9c-k5gw9\" (UID: \"4ff00812-1a0c-4bbc-8222-d7765505af6b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.152183 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d02c7cde-9b06-4366-9ad0-b46d403446b1-encryption-config\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.152216 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.152672 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.153101 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/445165fe-92bf-4fb9-9f6e-4f8101278621-machine-approver-tls\") pod \"machine-approver-56656f9798-rp2kv\" (UID: \"445165fe-92bf-4fb9-9f6e-4f8101278621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rp2kv" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.153628 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cc2c35c-7bb7-4475-a318-0133139b9359-serving-cert\") pod \"controller-manager-879f6c89f-hrb2j\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.154037 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.154059 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.159717 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qvs5j"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.159817 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-k9x8b"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.161209 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.163894 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.168481 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00-serving-cert\") pod \"openshift-config-operator-7777fb866f-dxzfq\" (UID: \"4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.174906 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-26v4f"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.175745 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.176429 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-26v4f" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.177765 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hgrgd"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.180804 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.182560 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kxq8c"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.183985 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wsxc7"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.185217 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fmdqp"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.185341 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.191055 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-drrtb"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.197366 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8dtw"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.199360 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.199749 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.200882 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whlbb"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.203253 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-cl4kt"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.205410 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.207068 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.208866 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8thws"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.209924 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ps4tl"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.210921 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7lcjb"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.211768 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wsxc7"] Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.220050 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.240625 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.248382 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dc34f331-e440-4ee3-98db-070d3115e2fd-etcd-client\") pod \"etcd-operator-b45778765-cjczz\" (UID: \"dc34f331-e440-4ee3-98db-070d3115e2fd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.248462 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dc34f331-e440-4ee3-98db-070d3115e2fd-etcd-service-ca\") pod \"etcd-operator-b45778765-cjczz\" (UID: \"dc34f331-e440-4ee3-98db-070d3115e2fd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.248482 4865 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m4frk\" (UniqueName: \"kubernetes.io/projected/26c8d7ff-d655-4d40-b99c-e11daec5b263-kube-api-access-m4frk\") pod \"cluster-image-registry-operator-dc59b4c8b-6s5ts\" (UID: \"26c8d7ff-d655-4d40-b99c-e11daec5b263\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.248524 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc594832-ffe0-4764-95ac-f62739f0314e-config\") pod \"kube-controller-manager-operator-78b949d7b-v5wv7\" (UID: \"cc594832-ffe0-4764-95ac-f62739f0314e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v5wv7" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.248541 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc34f331-e440-4ee3-98db-070d3115e2fd-config\") pod \"etcd-operator-b45778765-cjczz\" (UID: \"dc34f331-e440-4ee3-98db-070d3115e2fd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.248556 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc34f331-e440-4ee3-98db-070d3115e2fd-serving-cert\") pod \"etcd-operator-b45778765-cjczz\" (UID: \"dc34f331-e440-4ee3-98db-070d3115e2fd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.248572 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2l69\" (UniqueName: \"kubernetes.io/projected/d5a2cf4e-5936-433e-ae66-9906820ebe95-kube-api-access-b2l69\") pod \"catalog-operator-68c6474976-bg82r\" (UID: \"d5a2cf4e-5936-433e-ae66-9906820ebe95\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.248595 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26c8d7ff-d655-4d40-b99c-e11daec5b263-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6s5ts\" (UID: \"26c8d7ff-d655-4d40-b99c-e11daec5b263\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.248612 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/26c8d7ff-d655-4d40-b99c-e11daec5b263-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6s5ts\" (UID: \"26c8d7ff-d655-4d40-b99c-e11daec5b263\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.248659 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/26c8d7ff-d655-4d40-b99c-e11daec5b263-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6s5ts\" (UID: \"26c8d7ff-d655-4d40-b99c-e11daec5b263\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.248676 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/3f657509-34dd-4ea9-84fb-ce4548fde2f3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-w28tr\" (UID: \"3f657509-34dd-4ea9-84fb-ce4548fde2f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.248696 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f657509-34dd-4ea9-84fb-ce4548fde2f3-metrics-tls\") pod \"ingress-operator-5b745b69d9-w28tr\" (UID: \"3f657509-34dd-4ea9-84fb-ce4548fde2f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.248716 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f657509-34dd-4ea9-84fb-ce4548fde2f3-trusted-ca\") pod \"ingress-operator-5b745b69d9-w28tr\" (UID: \"3f657509-34dd-4ea9-84fb-ce4548fde2f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.248851 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dc34f331-e440-4ee3-98db-070d3115e2fd-etcd-ca\") pod \"etcd-operator-b45778765-cjczz\" (UID: \"dc34f331-e440-4ee3-98db-070d3115e2fd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.248870 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxbfz\" (UniqueName: \"kubernetes.io/projected/3f657509-34dd-4ea9-84fb-ce4548fde2f3-kube-api-access-dxbfz\") pod \"ingress-operator-5b745b69d9-w28tr\" (UID: \"3f657509-34dd-4ea9-84fb-ce4548fde2f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.248884 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrjwg\" (UniqueName: \"kubernetes.io/projected/dc34f331-e440-4ee3-98db-070d3115e2fd-kube-api-access-xrjwg\") pod \"etcd-operator-b45778765-cjczz\" (UID: \"dc34f331-e440-4ee3-98db-070d3115e2fd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.248911 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d5a2cf4e-5936-433e-ae66-9906820ebe95-profile-collector-cert\") pod \"catalog-operator-68c6474976-bg82r\" (UID: \"d5a2cf4e-5936-433e-ae66-9906820ebe95\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.249126 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc594832-ffe0-4764-95ac-f62739f0314e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-v5wv7\" (UID: \"cc594832-ffe0-4764-95ac-f62739f0314e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v5wv7" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.249146 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d5a2cf4e-5936-433e-ae66-9906820ebe95-srv-cert\") pod \"catalog-operator-68c6474976-bg82r\" (UID: \"d5a2cf4e-5936-433e-ae66-9906820ebe95\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.249165 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc594832-ffe0-4764-95ac-f62739f0314e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-v5wv7\" (UID: \"cc594832-ffe0-4764-95ac-f62739f0314e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v5wv7" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.249462 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc594832-ffe0-4764-95ac-f62739f0314e-config\") pod \"kube-controller-manager-operator-78b949d7b-v5wv7\" (UID: \"cc594832-ffe0-4764-95ac-f62739f0314e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v5wv7" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.251398 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f657509-34dd-4ea9-84fb-ce4548fde2f3-trusted-ca\") pod \"ingress-operator-5b745b69d9-w28tr\" (UID: \"3f657509-34dd-4ea9-84fb-ce4548fde2f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.252257 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26c8d7ff-d655-4d40-b99c-e11daec5b263-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6s5ts\" (UID: \"26c8d7ff-d655-4d40-b99c-e11daec5b263\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.253174 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc594832-ffe0-4764-95ac-f62739f0314e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-v5wv7\" (UID: \"cc594832-ffe0-4764-95ac-f62739f0314e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v5wv7" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.254298 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f657509-34dd-4ea9-84fb-ce4548fde2f3-metrics-tls\") pod \"ingress-operator-5b745b69d9-w28tr\" (UID: \"3f657509-34dd-4ea9-84fb-ce4548fde2f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.260278 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.280173 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.300065 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.321388 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.340033 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" 
Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.361167 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.379931 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.400152 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.419815 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.446448 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.481179 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.494069 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/26c8d7ff-d655-4d40-b99c-e11daec5b263-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6s5ts\" (UID: \"26c8d7ff-d655-4d40-b99c-e11daec5b263\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.501145 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.520959 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.541148 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.560355 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.580174 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.599548 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.600381 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc34f331-e440-4ee3-98db-070d3115e2fd-config\") pod \"etcd-operator-b45778765-cjczz\" (UID: \"dc34f331-e440-4ee3-98db-070d3115e2fd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.620540 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.633468 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc34f331-e440-4ee3-98db-070d3115e2fd-serving-cert\") pod \"etcd-operator-b45778765-cjczz\" (UID: 
\"dc34f331-e440-4ee3-98db-070d3115e2fd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.641149 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.652284 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dc34f331-e440-4ee3-98db-070d3115e2fd-etcd-client\") pod \"etcd-operator-b45778765-cjczz\" (UID: \"dc34f331-e440-4ee3-98db-070d3115e2fd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.659993 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.680640 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.692071 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dc34f331-e440-4ee3-98db-070d3115e2fd-etcd-ca\") pod \"etcd-operator-b45778765-cjczz\" (UID: \"dc34f331-e440-4ee3-98db-070d3115e2fd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.700632 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.710360 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dc34f331-e440-4ee3-98db-070d3115e2fd-etcd-service-ca\") pod \"etcd-operator-b45778765-cjczz\" (UID: \"dc34f331-e440-4ee3-98db-070d3115e2fd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.719866 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.740962 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.761373 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.780042 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.800614 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.820025 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.840753 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.861231 4865 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.880891 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.900055 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.920754 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.941341 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.960264 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 05:53:41 crc kubenswrapper[4865]: I1205 05:53:41.980055 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.000188 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.006155 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.006166 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.006177 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.020308 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.038212 4865 request.go:700] Waited for 1.005958409s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpackageserver-service-cert&limit=500&resourceVersion=0 Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.040160 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.060600 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.080276 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.099840 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.121217 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.134093 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d5a2cf4e-5936-433e-ae66-9906820ebe95-srv-cert\") pod \"catalog-operator-68c6474976-bg82r\" (UID: \"d5a2cf4e-5936-433e-ae66-9906820ebe95\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.141052 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.161099 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.173533 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d5a2cf4e-5936-433e-ae66-9906820ebe95-profile-collector-cert\") pod \"catalog-operator-68c6474976-bg82r\" (UID: \"d5a2cf4e-5936-433e-ae66-9906820ebe95\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.182329 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.200391 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.220616 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.240927 4865 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.260774 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.279966 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.320119 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.340082 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.359858 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.380932 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.399776 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.420729 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.440425 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.460440 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.480039 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.500390 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.527469 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.540056 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.560612 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.579629 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.600047 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.619927 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 05 05:53:42 crc 
kubenswrapper[4865]: I1205 05:53:42.640282 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.659994 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.680356 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.720506 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m84hq\" (UniqueName: \"kubernetes.io/projected/320fa4b0-5e00-4bca-b8f2-1afa7387d156-kube-api-access-m84hq\") pod \"apiserver-76f77b778f-pv5k9\" (UID: \"320fa4b0-5e00-4bca-b8f2-1afa7387d156\") " pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.740015 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch7jn\" (UniqueName: \"kubernetes.io/projected/dabab9ed-be46-4cbf-a0e5-8e3679b3b434-kube-api-access-ch7jn\") pod \"openshift-apiserver-operator-796bbdcf4f-z77ns\" (UID: \"dabab9ed-be46-4cbf-a0e5-8e3679b3b434\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z77ns" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.754082 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nssgl\" (UniqueName: \"kubernetes.io/projected/d3782972-9e30-42ae-9af7-9cba4bdcbfe3-kube-api-access-nssgl\") pod \"cluster-samples-operator-665b6dd947-sbmfc\" (UID: \"d3782972-9e30-42ae-9af7-9cba4bdcbfe3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbmfc" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.756502 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z77ns" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.775203 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2xgw\" (UniqueName: \"kubernetes.io/projected/bae52c2b-77f5-4196-804b-e72254ce6f24-kube-api-access-k2xgw\") pod \"openshift-controller-manager-operator-756b6f6bc6-5vtbs\" (UID: \"bae52c2b-77f5-4196-804b-e72254ce6f24\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5vtbs" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.779522 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.801555 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.820284 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.839735 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.859758 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbmfc" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.862106 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.880063 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.901793 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.945170 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnswt\" (UniqueName: \"kubernetes.io/projected/5f6c0508-ee1a-4224-bdc6-f6f7af78d75a-kube-api-access-tnswt\") pod \"authentication-operator-69f744f599-xhxgz\" (UID: \"5f6c0508-ee1a-4224-bdc6-f6f7af78d75a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.956195 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmqmx\" (UniqueName: \"kubernetes.io/projected/60c9fd82-77d0-4c86-9ff6-0489fbeab324-kube-api-access-wmqmx\") pod \"console-operator-58897d9998-95g8c\" (UID: \"60c9fd82-77d0-4c86-9ff6-0489fbeab324\") " pod="openshift-console-operator/console-operator-58897d9998-95g8c" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.971967 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:42 crc kubenswrapper[4865]: I1205 05:53:42.984431 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl5np\" (UniqueName: \"kubernetes.io/projected/d02c7cde-9b06-4366-9ad0-b46d403446b1-kube-api-access-zl5np\") pod \"apiserver-7bbb656c7d-mfc6c\" (UID: \"d02c7cde-9b06-4366-9ad0-b46d403446b1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.021946 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhl28\" (UniqueName: \"kubernetes.io/projected/8cc2c35c-7bb7-4475-a318-0133139b9359-kube-api-access-hhl28\") pod \"controller-manager-879f6c89f-hrb2j\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.022833 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qbxk\" (UniqueName: \"kubernetes.io/projected/4ff00812-1a0c-4bbc-8222-d7765505af6b-kube-api-access-9qbxk\") pod \"route-controller-manager-6576b87f9c-k5gw9\" (UID: \"4ff00812-1a0c-4bbc-8222-d7765505af6b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.038225 4865 request.go:700] Waited for 1.901043571s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/serviceaccounts/openshift-config-operator/token Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.038282 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z77ns"] Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.040898 4865 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-r5pnt\" (UniqueName: \"kubernetes.io/projected/445165fe-92bf-4fb9-9f6e-4f8101278621-kube-api-access-r5pnt\") pod \"machine-approver-56656f9798-rp2kv\" (UID: \"445165fe-92bf-4fb9-9f6e-4f8101278621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rp2kv" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.042778 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5vtbs" Dec 05 05:53:43 crc kubenswrapper[4865]: W1205 05:53:43.045058 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddabab9ed_be46_4cbf_a0e5_8e3679b3b434.slice/crio-013d538d6edd4bdfef467383fc3f52aa701a22ac11e958a29bf6cae668a31150 WatchSource:0}: Error finding container 013d538d6edd4bdfef467383fc3f52aa701a22ac11e958a29bf6cae668a31150: Status 404 returned error can't find the container with id 013d538d6edd4bdfef467383fc3f52aa701a22ac11e958a29bf6cae668a31150 Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.055693 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56wl8\" (UniqueName: \"kubernetes.io/projected/4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00-kube-api-access-56wl8\") pod \"openshift-config-operator-7777fb866f-dxzfq\" (UID: \"4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.084522 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.086952 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szwj6\" (UniqueName: \"kubernetes.io/projected/87a82cae-057c-47d5-9703-eb48128e1bd9-kube-api-access-szwj6\") pod \"oauth-openshift-558db77b4-82xnx\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.101125 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.119745 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.124831 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.124935 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.130546 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbmfc"] Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.141441 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.149893 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rp2kv" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.159994 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.180053 4865 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.182070 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.201172 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.213092 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-95g8c" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.239742 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4frk\" (UniqueName: \"kubernetes.io/projected/26c8d7ff-d655-4d40-b99c-e11daec5b263-kube-api-access-m4frk\") pod \"cluster-image-registry-operator-dc59b4c8b-6s5ts\" (UID: \"26c8d7ff-d655-4d40-b99c-e11daec5b263\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.258855 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pv5k9"] Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.260315 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2l69\" (UniqueName: \"kubernetes.io/projected/d5a2cf4e-5936-433e-ae66-9906820ebe95-kube-api-access-b2l69\") pod \"catalog-operator-68c6474976-bg82r\" (UID: \"d5a2cf4e-5936-433e-ae66-9906820ebe95\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.281258 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.283805 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxbfz\" (UniqueName: \"kubernetes.io/projected/3f657509-34dd-4ea9-84fb-ce4548fde2f3-kube-api-access-dxbfz\") pod \"ingress-operator-5b745b69d9-w28tr\" (UID: \"3f657509-34dd-4ea9-84fb-ce4548fde2f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.293909 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.303486 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f657509-34dd-4ea9-84fb-ce4548fde2f3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-w28tr\" (UID: \"3f657509-34dd-4ea9-84fb-ce4548fde2f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.320226 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rp2kv" event={"ID":"445165fe-92bf-4fb9-9f6e-4f8101278621","Type":"ContainerStarted","Data":"e453fae0a85ae24a2ce16cb2a5ca6d7018cdec03191aebdec00b17e3b372e7c9"} Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.325674 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5vtbs"] Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.329350 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z77ns" event={"ID":"dabab9ed-be46-4cbf-a0e5-8e3679b3b434","Type":"ContainerStarted","Data":"013d538d6edd4bdfef467383fc3f52aa701a22ac11e958a29bf6cae668a31150"} Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.330334 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" event={"ID":"320fa4b0-5e00-4bca-b8f2-1afa7387d156","Type":"ContainerStarted","Data":"43290c2c303ba07b032cd70c4a85e6b2c670faf19de30606c67b283c7146c60e"} Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.330521 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.332508 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrjwg\" (UniqueName: \"kubernetes.io/projected/dc34f331-e440-4ee3-98db-070d3115e2fd-kube-api-access-xrjwg\") pod \"etcd-operator-b45778765-cjczz\" (UID: \"dc34f331-e440-4ee3-98db-070d3115e2fd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.332643 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.338014 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc594832-ffe0-4764-95ac-f62739f0314e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-v5wv7\" (UID: \"cc594832-ffe0-4764-95ac-f62739f0314e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v5wv7" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.355563 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/26c8d7ff-d655-4d40-b99c-e11daec5b263-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6s5ts\" (UID: \"26c8d7ff-d655-4d40-b99c-e11daec5b263\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.381578 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.383307 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.398337 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.400604 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.420228 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.441527 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.461259 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c"] Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.462608 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r" Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.507639 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9"] Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.537510 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-95g8c"] Dec 05 05:53:43 crc kubenswrapper[4865]: I1205 05:53:43.570266 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xhxgz"] Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.640755 4865 request.go:700] Waited for 1.051171116s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-dxzfq Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.643313 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v5wv7" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.644231 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5dbbc361-a522-4548-a14e-bdd061c7bc4b-registry-tls\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.644304 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: E1205 05:53:44.645164 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:45.14512589 +0000 UTC m=+44.425137112 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.698955 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-95g8c" event={"ID":"60c9fd82-77d0-4c86-9ff6-0489fbeab324","Type":"ContainerStarted","Data":"7f858414fb7bc848ed52a5e74c9069336572a12f51e4b9d37ab4058d8eb3d238"} Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.717020 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" event={"ID":"4ff00812-1a0c-4bbc-8222-d7765505af6b","Type":"ContainerStarted","Data":"c7314007fcf9afbdb17a0d3e04d7f10fe37c0bfc559874e550f9ee5b834f0ad4"} Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.727682 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5vtbs" event={"ID":"bae52c2b-77f5-4196-804b-e72254ce6f24","Type":"ContainerStarted","Data":"1c18bda25e2dde05d443bb58709c9b51d88f181eca4ce59ad4289bf9317805db"} Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.733481 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" event={"ID":"d02c7cde-9b06-4366-9ad0-b46d403446b1","Type":"ContainerStarted","Data":"9021f55daaaf9a15d1a5dd272b867da5f58e75feaabfeeac79e293414eecb92a"} Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.748253 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.748399 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a-metrics-certs\") pod \"router-default-5444994796-h6t2z\" (UID: \"6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a\") " pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.748422 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdrmq\" (UniqueName: \"kubernetes.io/projected/c0cebc10-c0ad-419c-903c-341c516f1527-kube-api-access-sdrmq\") pod \"machine-api-operator-5694c8668f-v2jhh\" (UID: \"c0cebc10-c0ad-419c-903c-341c516f1527\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v2jhh" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.748459 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5dbbc361-a522-4548-a14e-bdd061c7bc4b-bound-sa-token\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.748477 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0cebc10-c0ad-419c-903c-341c516f1527-config\") pod \"machine-api-operator-5694c8668f-v2jhh\" (UID: \"c0cebc10-c0ad-419c-903c-341c516f1527\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v2jhh" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.748506 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e31d52b-ab95-489d-bbff-34e4b0daf602-proxy-tls\") pod \"machine-config-operator-74547568cd-zbsvt\" (UID: \"7e31d52b-ab95-489d-bbff-34e4b0daf602\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.748552 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-console-config\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.748575 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c0cebc10-c0ad-419c-903c-341c516f1527-images\") pod \"machine-api-operator-5694c8668f-v2jhh\" (UID: \"c0cebc10-c0ad-419c-903c-341c516f1527\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v2jhh" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.748593 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51c9c322-fb54-438d-a353-d8deaa07b4fb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kxq8c\" (UID: \"51c9c322-fb54-438d-a353-d8deaa07b4fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kxq8c" Dec 05 05:53:44 crc 
kubenswrapper[4865]: I1205 05:53:44.748608 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5dbbc361-a522-4548-a14e-bdd061c7bc4b-trusted-ca\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.748625 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wqh5\" (UniqueName: \"kubernetes.io/projected/d60e1629-83b6-4492-bd6c-c0ed90da02be-kube-api-access-7wqh5\") pod \"collect-profiles-29415225-7pnfb\" (UID: \"d60e1629-83b6-4492-bd6c-c0ed90da02be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.748656 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qhmr\" (UniqueName: \"kubernetes.io/projected/d871e232-b94a-4c90-b2c1-775f93eaa51e-kube-api-access-8qhmr\") pod \"dns-operator-744455d44c-vqhhk\" (UID: \"d871e232-b94a-4c90-b2c1-775f93eaa51e\") " pod="openshift-dns-operator/dns-operator-744455d44c-vqhhk" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.748675 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed63d1de-99a4-4b79-9aee-49a5484d968f-apiservice-cert\") pod \"packageserver-d55dfcdfc-7jm2h\" (UID: \"ed63d1de-99a4-4b79-9aee-49a5484d968f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.748713 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5dbbc361-a522-4548-a14e-bdd061c7bc4b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.748733 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/277ec781-2698-4637-a619-e9bdc3fcabae-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qvs5j\" (UID: \"277ec781-2698-4637-a619-e9bdc3fcabae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qvs5j" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.748754 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7e31d52b-ab95-489d-bbff-34e4b0daf602-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zbsvt\" (UID: \"7e31d52b-ab95-489d-bbff-34e4b0daf602\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.748792 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51c9c322-fb54-438d-a353-d8deaa07b4fb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kxq8c\" (UID: \"51c9c322-fb54-438d-a353-d8deaa07b4fb\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kxq8c" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.748805 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/277ec781-2698-4637-a619-e9bdc3fcabae-config\") pod \"kube-apiserver-operator-766d6c64bb-qvs5j\" (UID: \"277ec781-2698-4637-a619-e9bdc3fcabae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qvs5j" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.748931 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5dbbc361-a522-4548-a14e-bdd061c7bc4b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.755784 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a-service-ca-bundle\") pod \"router-default-5444994796-h6t2z\" (UID: \"6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a\") " pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.755907 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5dbbc361-a522-4548-a14e-bdd061c7bc4b-registry-tls\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.755933 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8-proxy-tls\") pod \"machine-config-controller-84d6567774-l4sds\" (UID: \"26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l4sds" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.755958 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a-stats-auth\") pod \"router-default-5444994796-h6t2z\" (UID: \"6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a\") " pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.755978 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx7fs\" (UniqueName: \"kubernetes.io/projected/6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a-kube-api-access-vx7fs\") pod \"router-default-5444994796-h6t2z\" (UID: \"6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a\") " pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.756012 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d871e232-b94a-4c90-b2c1-775f93eaa51e-metrics-tls\") pod \"dns-operator-744455d44c-vqhhk\" (UID: \"d871e232-b94a-4c90-b2c1-775f93eaa51e\") " pod="openshift-dns-operator/dns-operator-744455d44c-vqhhk" Dec 05 05:53:44 crc 
kubenswrapper[4865]: E1205 05:53:44.757020 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:45.256998775 +0000 UTC m=+44.537009997 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.757235 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/277ec781-2698-4637-a619-e9bdc3fcabae-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qvs5j\" (UID: \"277ec781-2698-4637-a619-e9bdc3fcabae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qvs5j" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.757274 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqgcm\" (UniqueName: \"kubernetes.io/projected/910872c7-f875-4697-9a0a-3ae986b9fca7-kube-api-access-mqgcm\") pod \"migrator-59844c95c7-zjllp\" (UID: \"910872c7-f875-4697-9a0a-3ae986b9fca7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zjllp" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.757317 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f4551408-dc5b-41f2-b83f-db54412a6d25-signing-key\") pod \"service-ca-9c57cc56f-cl4kt\" (UID: \"f4551408-dc5b-41f2-b83f-db54412a6d25\") " pod="openshift-service-ca/service-ca-9c57cc56f-cl4kt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.757346 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a-default-certificate\") pod \"router-default-5444994796-h6t2z\" (UID: \"6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a\") " pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.757372 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dff0db39-9f6f-4455-8cab-8d4cdce33b04-console-oauth-config\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.757410 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbf6l\" (UniqueName: \"kubernetes.io/projected/7e31d52b-ab95-489d-bbff-34e4b0daf602-kube-api-access-tbf6l\") pod \"machine-config-operator-74547568cd-zbsvt\" (UID: \"7e31d52b-ab95-489d-bbff-34e4b0daf602\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.757437 4865 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed63d1de-99a4-4b79-9aee-49a5484d968f-webhook-cert\") pod \"packageserver-d55dfcdfc-7jm2h\" (UID: \"ed63d1de-99a4-4b79-9aee-49a5484d968f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.757815 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr6qv\" (UniqueName: \"kubernetes.io/projected/f4551408-dc5b-41f2-b83f-db54412a6d25-kube-api-access-mr6qv\") pod \"service-ca-9c57cc56f-cl4kt\" (UID: \"f4551408-dc5b-41f2-b83f-db54412a6d25\") " pod="openshift-service-ca/service-ca-9c57cc56f-cl4kt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.757950 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5dbbc361-a522-4548-a14e-bdd061c7bc4b-registry-certificates\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.757983 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d60e1629-83b6-4492-bd6c-c0ed90da02be-config-volume\") pod \"collect-profiles-29415225-7pnfb\" (UID: \"d60e1629-83b6-4492-bd6c-c0ed90da02be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.758078 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzc2c\" (UniqueName: \"kubernetes.io/projected/55de6799-e76b-4493-a007-49cd203e7573-kube-api-access-xzc2c\") pod \"downloads-7954f5f757-d6z25\" (UID: \"55de6799-e76b-4493-a007-49cd203e7573\") " pod="openshift-console/downloads-7954f5f757-d6z25" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.759556 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgwgk\" (UniqueName: \"kubernetes.io/projected/dff0db39-9f6f-4455-8cab-8d4cdce33b04-kube-api-access-bgwgk\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.759670 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ed63d1de-99a4-4b79-9aee-49a5484d968f-tmpfs\") pod \"packageserver-d55dfcdfc-7jm2h\" (UID: \"ed63d1de-99a4-4b79-9aee-49a5484d968f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.759904 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg2vv\" (UniqueName: \"kubernetes.io/projected/d4c0391a-598b-4504-9601-c1b362c3060c-kube-api-access-xg2vv\") pod \"multus-admission-controller-857f4d67dd-drrtb\" (UID: \"d4c0391a-598b-4504-9601-c1b362c3060c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-drrtb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.759984 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lv4sn\" (UniqueName: \"kubernetes.io/projected/ed63d1de-99a4-4b79-9aee-49a5484d968f-kube-api-access-lv4sn\") pod \"packageserver-d55dfcdfc-7jm2h\" (UID: \"ed63d1de-99a4-4b79-9aee-49a5484d968f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.760147 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqw68\" (UniqueName: \"kubernetes.io/projected/5dbbc361-a522-4548-a14e-bdd061c7bc4b-kube-api-access-qqw68\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.760451 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.760517 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f4551408-dc5b-41f2-b83f-db54412a6d25-signing-cabundle\") pod \"service-ca-9c57cc56f-cl4kt\" (UID: \"f4551408-dc5b-41f2-b83f-db54412a6d25\") " pod="openshift-service-ca/service-ca-9c57cc56f-cl4kt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.760538 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dff0db39-9f6f-4455-8cab-8d4cdce33b04-console-serving-cert\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.760583 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-oauth-serving-cert\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.760622 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9kkg\" (UniqueName: \"kubernetes.io/projected/51c9c322-fb54-438d-a353-d8deaa07b4fb-kube-api-access-g9kkg\") pod \"kube-storage-version-migrator-operator-b67b599dd-kxq8c\" (UID: \"51c9c322-fb54-438d-a353-d8deaa07b4fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kxq8c" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.760656 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d4c0391a-598b-4504-9601-c1b362c3060c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-drrtb\" (UID: \"d4c0391a-598b-4504-9601-c1b362c3060c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-drrtb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.760979 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-trusted-ca-bundle\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.761040 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7e31d52b-ab95-489d-bbff-34e4b0daf602-images\") pod \"machine-config-operator-74547568cd-zbsvt\" (UID: \"7e31d52b-ab95-489d-bbff-34e4b0daf602\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt" Dec 05 05:53:44 crc kubenswrapper[4865]: E1205 05:53:44.761050 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:45.26103618 +0000 UTC m=+44.541047402 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.761073 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c0cebc10-c0ad-419c-903c-341c516f1527-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-v2jhh\" (UID: \"c0cebc10-c0ad-419c-903c-341c516f1527\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v2jhh" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.761094 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-service-ca\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.761140 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-l4sds\" (UID: \"26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l4sds" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.761179 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl2fd\" (UniqueName: \"kubernetes.io/projected/26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8-kube-api-access-cl2fd\") pod \"machine-config-controller-84d6567774-l4sds\" (UID: \"26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l4sds" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.761198 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d60e1629-83b6-4492-bd6c-c0ed90da02be-secret-volume\") pod \"collect-profiles-29415225-7pnfb\" (UID: \"d60e1629-83b6-4492-bd6c-c0ed90da02be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.768844 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5dbbc361-a522-4548-a14e-bdd061c7bc4b-registry-tls\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.865526 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.865736 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h9z2\" (UniqueName: \"kubernetes.io/projected/c3633127-3192-43e8-87a2-5049b2d82fa6-kube-api-access-9h9z2\") pod \"cni-sysctl-allowlist-ds-k9x8b\" (UID: \"c3633127-3192-43e8-87a2-5049b2d82fa6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.865781 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e31d52b-ab95-489d-bbff-34e4b0daf602-proxy-tls\") pod \"machine-config-operator-74547568cd-zbsvt\" (UID: \"7e31d52b-ab95-489d-bbff-34e4b0daf602\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.865804 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-console-config\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.866155 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c0cebc10-c0ad-419c-903c-341c516f1527-images\") pod \"machine-api-operator-5694c8668f-v2jhh\" (UID: \"c0cebc10-c0ad-419c-903c-341c516f1527\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v2jhh" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.866179 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgfz2\" (UniqueName: \"kubernetes.io/projected/9813d882-ad86-4946-96ae-caa85c66aaab-kube-api-access-qgfz2\") pod \"olm-operator-6b444d44fb-h4jvf\" (UID: \"9813d882-ad86-4946-96ae-caa85c66aaab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.866200 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51c9c322-fb54-438d-a353-d8deaa07b4fb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kxq8c\" (UID: \"51c9c322-fb54-438d-a353-d8deaa07b4fb\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kxq8c" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.866219 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d930639e-351d-4089-88b9-966335507daf-node-bootstrap-token\") pod \"machine-config-server-26v4f\" (UID: \"d930639e-351d-4089-88b9-966335507daf\") " pod="openshift-machine-config-operator/machine-config-server-26v4f" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.866251 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5dbbc361-a522-4548-a14e-bdd061c7bc4b-trusted-ca\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.868533 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-console-config\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.868595 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgqzz\" (UniqueName: \"kubernetes.io/projected/63297c46-fc31-471d-99dc-47352f52e76c-kube-api-access-bgqzz\") pod \"csi-hostpathplugin-wsxc7\" (UID: \"63297c46-fc31-471d-99dc-47352f52e76c\") " pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.868647 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/63297c46-fc31-471d-99dc-47352f52e76c-plugins-dir\") pod \"csi-hostpathplugin-wsxc7\" (UID: \"63297c46-fc31-471d-99dc-47352f52e76c\") " pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.868672 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3633127-3192-43e8-87a2-5049b2d82fa6-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-k9x8b\" (UID: \"c3633127-3192-43e8-87a2-5049b2d82fa6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.868726 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qhmr\" (UniqueName: \"kubernetes.io/projected/d871e232-b94a-4c90-b2c1-775f93eaa51e-kube-api-access-8qhmr\") pod \"dns-operator-744455d44c-vqhhk\" (UID: \"d871e232-b94a-4c90-b2c1-775f93eaa51e\") " pod="openshift-dns-operator/dns-operator-744455d44c-vqhhk" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.868752 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wqh5\" (UniqueName: \"kubernetes.io/projected/d60e1629-83b6-4492-bd6c-c0ed90da02be-kube-api-access-7wqh5\") pod \"collect-profiles-29415225-7pnfb\" (UID: \"d60e1629-83b6-4492-bd6c-c0ed90da02be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.868780 4865 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab8b1868-5f2d-4c43-a63c-525760727b75-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-whlbb\" (UID: \"ab8b1868-5f2d-4c43-a63c-525760727b75\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whlbb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.868808 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjh4d\" (UniqueName: \"kubernetes.io/projected/d930639e-351d-4089-88b9-966335507daf-kube-api-access-tjh4d\") pod \"machine-config-server-26v4f\" (UID: \"d930639e-351d-4089-88b9-966335507daf\") " pod="openshift-machine-config-operator/machine-config-server-26v4f" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.868850 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5dbbc361-a522-4548-a14e-bdd061c7bc4b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.868870 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/277ec781-2698-4637-a619-e9bdc3fcabae-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qvs5j\" (UID: \"277ec781-2698-4637-a619-e9bdc3fcabae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qvs5j" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.868893 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed63d1de-99a4-4b79-9aee-49a5484d968f-apiservice-cert\") pod \"packageserver-d55dfcdfc-7jm2h\" (UID: \"ed63d1de-99a4-4b79-9aee-49a5484d968f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.868952 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7e31d52b-ab95-489d-bbff-34e4b0daf602-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zbsvt\" (UID: \"7e31d52b-ab95-489d-bbff-34e4b0daf602\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.868975 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45637469-1150-4175-a642-a33eeb1c7a9d-serving-cert\") pod \"service-ca-operator-777779d784-hgrgd\" (UID: \"45637469-1150-4175-a642-a33eeb1c7a9d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hgrgd" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869025 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/c3633127-3192-43e8-87a2-5049b2d82fa6-ready\") pod \"cni-sysctl-allowlist-ds-k9x8b\" (UID: \"c3633127-3192-43e8-87a2-5049b2d82fa6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869044 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5dbbc361-a522-4548-a14e-bdd061c7bc4b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869063 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a-service-ca-bundle\") pod \"router-default-5444994796-h6t2z\" (UID: \"6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a\") " pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869082 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51c9c322-fb54-438d-a353-d8deaa07b4fb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kxq8c\" (UID: \"51c9c322-fb54-438d-a353-d8deaa07b4fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kxq8c" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869100 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/277ec781-2698-4637-a619-e9bdc3fcabae-config\") pod \"kube-apiserver-operator-766d6c64bb-qvs5j\" (UID: \"277ec781-2698-4637-a619-e9bdc3fcabae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qvs5j" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869135 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8-proxy-tls\") pod \"machine-config-controller-84d6567774-l4sds\" (UID: \"26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l4sds" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869157 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c3633127-3192-43e8-87a2-5049b2d82fa6-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-k9x8b\" (UID: \"c3633127-3192-43e8-87a2-5049b2d82fa6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869178 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a-stats-auth\") pod \"router-default-5444994796-h6t2z\" (UID: \"6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a\") " pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869198 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx7fs\" (UniqueName: \"kubernetes.io/projected/6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a-kube-api-access-vx7fs\") pod \"router-default-5444994796-h6t2z\" (UID: \"6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a\") " pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869217 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9813d882-ad86-4946-96ae-caa85c66aaab-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-h4jvf\" (UID: \"9813d882-ad86-4946-96ae-caa85c66aaab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869250 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d871e232-b94a-4c90-b2c1-775f93eaa51e-metrics-tls\") pod \"dns-operator-744455d44c-vqhhk\" (UID: \"d871e232-b94a-4c90-b2c1-775f93eaa51e\") " pod="openshift-dns-operator/dns-operator-744455d44c-vqhhk" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869271 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/63297c46-fc31-471d-99dc-47352f52e76c-registration-dir\") pod \"csi-hostpathplugin-wsxc7\" (UID: \"63297c46-fc31-471d-99dc-47352f52e76c\") " pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869289 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/277ec781-2698-4637-a619-e9bdc3fcabae-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qvs5j\" (UID: \"277ec781-2698-4637-a619-e9bdc3fcabae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qvs5j" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869309 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqgcm\" (UniqueName: \"kubernetes.io/projected/910872c7-f875-4697-9a0a-3ae986b9fca7-kube-api-access-mqgcm\") pod \"migrator-59844c95c7-zjllp\" (UID: \"910872c7-f875-4697-9a0a-3ae986b9fca7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zjllp" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869327 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f49a368-065d-4057-a044-a019eba9ce9e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8thws\" (UID: \"1f49a368-065d-4057-a044-a019eba9ce9e\") " pod="openshift-marketplace/marketplace-operator-79b997595-8thws" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869347 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc67cfa4-a4e5-48c4-9f1e-fa56ac7ab17c-cert\") pod \"ingress-canary-7lcjb\" (UID: \"cc67cfa4-a4e5-48c4-9f1e-fa56ac7ab17c\") " pod="openshift-ingress-canary/ingress-canary-7lcjb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869367 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f4551408-dc5b-41f2-b83f-db54412a6d25-signing-key\") pod \"service-ca-9c57cc56f-cl4kt\" (UID: \"f4551408-dc5b-41f2-b83f-db54412a6d25\") " pod="openshift-service-ca/service-ca-9c57cc56f-cl4kt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869385 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2c86fc9-df64-415a-bfbe-8a6049dc4d55-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p8dtw\" (UID: \"d2c86fc9-df64-415a-bfbe-8a6049dc4d55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8dtw" Dec 05 05:53:44 crc 
kubenswrapper[4865]: I1205 05:53:44.869405 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a-default-certificate\") pod \"router-default-5444994796-h6t2z\" (UID: \"6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a\") " pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869425 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dff0db39-9f6f-4455-8cab-8d4cdce33b04-console-oauth-config\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869448 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbf6l\" (UniqueName: \"kubernetes.io/projected/7e31d52b-ab95-489d-bbff-34e4b0daf602-kube-api-access-tbf6l\") pod \"machine-config-operator-74547568cd-zbsvt\" (UID: \"7e31d52b-ab95-489d-bbff-34e4b0daf602\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869465 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/63297c46-fc31-471d-99dc-47352f52e76c-csi-data-dir\") pod \"csi-hostpathplugin-wsxc7\" (UID: \"63297c46-fc31-471d-99dc-47352f52e76c\") " pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869483 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed63d1de-99a4-4b79-9aee-49a5484d968f-webhook-cert\") pod \"packageserver-d55dfcdfc-7jm2h\" (UID: \"ed63d1de-99a4-4b79-9aee-49a5484d968f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869498 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d930639e-351d-4089-88b9-966335507daf-certs\") pod \"machine-config-server-26v4f\" (UID: \"d930639e-351d-4089-88b9-966335507daf\") " pod="openshift-machine-config-operator/machine-config-server-26v4f" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869513 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45637469-1150-4175-a642-a33eeb1c7a9d-config\") pod \"service-ca-operator-777779d784-hgrgd\" (UID: \"45637469-1150-4175-a642-a33eeb1c7a9d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hgrgd" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869538 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5dbbc361-a522-4548-a14e-bdd061c7bc4b-registry-certificates\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869579 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr6qv\" (UniqueName: 
\"kubernetes.io/projected/f4551408-dc5b-41f2-b83f-db54412a6d25-kube-api-access-mr6qv\") pod \"service-ca-9c57cc56f-cl4kt\" (UID: \"f4551408-dc5b-41f2-b83f-db54412a6d25\") " pod="openshift-service-ca/service-ca-9c57cc56f-cl4kt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869598 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d60e1629-83b6-4492-bd6c-c0ed90da02be-config-volume\") pod \"collect-profiles-29415225-7pnfb\" (UID: \"d60e1629-83b6-4492-bd6c-c0ed90da02be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869618 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzc2c\" (UniqueName: \"kubernetes.io/projected/55de6799-e76b-4493-a007-49cd203e7573-kube-api-access-xzc2c\") pod \"downloads-7954f5f757-d6z25\" (UID: \"55de6799-e76b-4493-a007-49cd203e7573\") " pod="openshift-console/downloads-7954f5f757-d6z25" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869636 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8f99\" (UniqueName: \"kubernetes.io/projected/cc67cfa4-a4e5-48c4-9f1e-fa56ac7ab17c-kube-api-access-l8f99\") pod \"ingress-canary-7lcjb\" (UID: \"cc67cfa4-a4e5-48c4-9f1e-fa56ac7ab17c\") " pod="openshift-ingress-canary/ingress-canary-7lcjb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869653 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5sxd\" (UniqueName: \"kubernetes.io/projected/0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5-kube-api-access-f5sxd\") pod \"dns-default-ps4tl\" (UID: \"0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5\") " pod="openshift-dns/dns-default-ps4tl" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869726 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgwgk\" (UniqueName: \"kubernetes.io/projected/dff0db39-9f6f-4455-8cab-8d4cdce33b04-kube-api-access-bgwgk\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869744 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/63297c46-fc31-471d-99dc-47352f52e76c-socket-dir\") pod \"csi-hostpathplugin-wsxc7\" (UID: \"63297c46-fc31-471d-99dc-47352f52e76c\") " pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869777 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5293d191-528f-4818-b897-11bb456c2b50-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fmdqp\" (UID: \"5293d191-528f-4818-b897-11bb456c2b50\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fmdqp" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869790 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51c9c322-fb54-438d-a353-d8deaa07b4fb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kxq8c\" (UID: \"51c9c322-fb54-438d-a353-d8deaa07b4fb\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kxq8c" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869798 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ed63d1de-99a4-4b79-9aee-49a5484d968f-tmpfs\") pod \"packageserver-d55dfcdfc-7jm2h\" (UID: \"ed63d1de-99a4-4b79-9aee-49a5484d968f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869903 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9813d882-ad86-4946-96ae-caa85c66aaab-srv-cert\") pod \"olm-operator-6b444d44fb-h4jvf\" (UID: \"9813d882-ad86-4946-96ae-caa85c66aaab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869948 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg2vv\" (UniqueName: \"kubernetes.io/projected/d4c0391a-598b-4504-9601-c1b362c3060c-kube-api-access-xg2vv\") pod \"multus-admission-controller-857f4d67dd-drrtb\" (UID: \"d4c0391a-598b-4504-9601-c1b362c3060c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-drrtb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869968 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2c86fc9-df64-415a-bfbe-8a6049dc4d55-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p8dtw\" (UID: \"d2c86fc9-df64-415a-bfbe-8a6049dc4d55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8dtw" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.869985 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tt8j\" (UniqueName: \"kubernetes.io/projected/45637469-1150-4175-a642-a33eeb1c7a9d-kube-api-access-8tt8j\") pod \"service-ca-operator-777779d784-hgrgd\" (UID: \"45637469-1150-4175-a642-a33eeb1c7a9d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hgrgd" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.870027 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv4sn\" (UniqueName: \"kubernetes.io/projected/ed63d1de-99a4-4b79-9aee-49a5484d968f-kube-api-access-lv4sn\") pod \"packageserver-d55dfcdfc-7jm2h\" (UID: \"ed63d1de-99a4-4b79-9aee-49a5484d968f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.870045 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5-metrics-tls\") pod \"dns-default-ps4tl\" (UID: \"0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5\") " pod="openshift-dns/dns-default-ps4tl" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.870082 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqw68\" (UniqueName: \"kubernetes.io/projected/5dbbc361-a522-4548-a14e-bdd061c7bc4b-kube-api-access-qqw68\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc 
kubenswrapper[4865]: I1205 05:53:44.870112 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1f49a368-065d-4057-a044-a019eba9ce9e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8thws\" (UID: \"1f49a368-065d-4057-a044-a019eba9ce9e\") " pod="openshift-marketplace/marketplace-operator-79b997595-8thws" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.870126 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ed63d1de-99a4-4b79-9aee-49a5484d968f-tmpfs\") pod \"packageserver-d55dfcdfc-7jm2h\" (UID: \"ed63d1de-99a4-4b79-9aee-49a5484d968f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.870143 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph2jj\" (UniqueName: \"kubernetes.io/projected/1f49a368-065d-4057-a044-a019eba9ce9e-kube-api-access-ph2jj\") pod \"marketplace-operator-79b997595-8thws\" (UID: \"1f49a368-065d-4057-a044-a019eba9ce9e\") " pod="openshift-marketplace/marketplace-operator-79b997595-8thws" Dec 05 05:53:44 crc kubenswrapper[4865]: E1205 05:53:44.870185 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:45.370168977 +0000 UTC m=+44.650180199 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.870216 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.870237 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f4551408-dc5b-41f2-b83f-db54412a6d25-signing-cabundle\") pod \"service-ca-9c57cc56f-cl4kt\" (UID: \"f4551408-dc5b-41f2-b83f-db54412a6d25\") " pod="openshift-service-ca/service-ca-9c57cc56f-cl4kt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.870256 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dff0db39-9f6f-4455-8cab-8d4cdce33b04-console-serving-cert\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.870277 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kkssp\" (UniqueName: \"kubernetes.io/projected/ab8b1868-5f2d-4c43-a63c-525760727b75-kube-api-access-kkssp\") pod \"package-server-manager-789f6589d5-whlbb\" (UID: \"ab8b1868-5f2d-4c43-a63c-525760727b75\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whlbb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.870296 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5-config-volume\") pod \"dns-default-ps4tl\" (UID: \"0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5\") " pod="openshift-dns/dns-default-ps4tl" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.872128 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c0cebc10-c0ad-419c-903c-341c516f1527-images\") pod \"machine-api-operator-5694c8668f-v2jhh\" (UID: \"c0cebc10-c0ad-419c-903c-341c516f1527\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v2jhh" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.872172 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-oauth-serving-cert\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.873852 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-oauth-serving-cert\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.874962 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9kkg\" (UniqueName: \"kubernetes.io/projected/51c9c322-fb54-438d-a353-d8deaa07b4fb-kube-api-access-g9kkg\") pod \"kube-storage-version-migrator-operator-b67b599dd-kxq8c\" (UID: \"51c9c322-fb54-438d-a353-d8deaa07b4fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kxq8c" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.875060 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2gtr\" (UniqueName: \"kubernetes.io/projected/5293d191-528f-4818-b897-11bb456c2b50-kube-api-access-v2gtr\") pod \"control-plane-machine-set-operator-78cbb6b69f-fmdqp\" (UID: \"5293d191-528f-4818-b897-11bb456c2b50\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fmdqp" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.875092 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d4c0391a-598b-4504-9601-c1b362c3060c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-drrtb\" (UID: \"d4c0391a-598b-4504-9601-c1b362c3060c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-drrtb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.875167 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/63297c46-fc31-471d-99dc-47352f52e76c-mountpoint-dir\") pod 
\"csi-hostpathplugin-wsxc7\" (UID: \"63297c46-fc31-471d-99dc-47352f52e76c\") " pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.875230 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-trusted-ca-bundle\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.875297 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7e31d52b-ab95-489d-bbff-34e4b0daf602-images\") pod \"machine-config-operator-74547568cd-zbsvt\" (UID: \"7e31d52b-ab95-489d-bbff-34e4b0daf602\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.875318 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c0cebc10-c0ad-419c-903c-341c516f1527-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-v2jhh\" (UID: \"c0cebc10-c0ad-419c-903c-341c516f1527\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v2jhh" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.875351 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-service-ca\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.875374 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-l4sds\" (UID: \"26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l4sds" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.875403 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d60e1629-83b6-4492-bd6c-c0ed90da02be-secret-volume\") pod \"collect-profiles-29415225-7pnfb\" (UID: \"d60e1629-83b6-4492-bd6c-c0ed90da02be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.875441 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl2fd\" (UniqueName: \"kubernetes.io/projected/26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8-kube-api-access-cl2fd\") pod \"machine-config-controller-84d6567774-l4sds\" (UID: \"26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l4sds" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.875477 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/397265e2-4c59-42f5-8774-357048ba57ac-metrics-certs\") pod \"network-metrics-daemon-j8p6s\" (UID: \"397265e2-4c59-42f5-8774-357048ba57ac\") " pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.875517 4865 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdrmq\" (UniqueName: \"kubernetes.io/projected/c0cebc10-c0ad-419c-903c-341c516f1527-kube-api-access-sdrmq\") pod \"machine-api-operator-5694c8668f-v2jhh\" (UID: \"c0cebc10-c0ad-419c-903c-341c516f1527\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v2jhh" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.875534 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a-metrics-certs\") pod \"router-default-5444994796-h6t2z\" (UID: \"6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a\") " pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.875653 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5dbbc361-a522-4548-a14e-bdd061c7bc4b-trusted-ca\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.875664 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5dbbc361-a522-4548-a14e-bdd061c7bc4b-bound-sa-token\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.875713 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0cebc10-c0ad-419c-903c-341c516f1527-config\") pod \"machine-api-operator-5694c8668f-v2jhh\" (UID: \"c0cebc10-c0ad-419c-903c-341c516f1527\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v2jhh" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.876143 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2c86fc9-df64-415a-bfbe-8a6049dc4d55-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p8dtw\" (UID: \"d2c86fc9-df64-415a-bfbe-8a6049dc4d55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8dtw" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.876801 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0cebc10-c0ad-419c-903c-341c516f1527-config\") pod \"machine-api-operator-5694c8668f-v2jhh\" (UID: \"c0cebc10-c0ad-419c-903c-341c516f1527\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v2jhh" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.877421 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5dbbc361-a522-4548-a14e-bdd061c7bc4b-registry-certificates\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.883159 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dff0db39-9f6f-4455-8cab-8d4cdce33b04-console-oauth-config\") pod \"console-f9d7485db-lclsv\" (UID: 
\"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.883433 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-l4sds\" (UID: \"26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l4sds" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.886053 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d60e1629-83b6-4492-bd6c-c0ed90da02be-config-volume\") pod \"collect-profiles-29415225-7pnfb\" (UID: \"d60e1629-83b6-4492-bd6c-c0ed90da02be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.886151 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-trusted-ca-bundle\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.886511 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5dbbc361-a522-4548-a14e-bdd061c7bc4b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.888785 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/277ec781-2698-4637-a619-e9bdc3fcabae-config\") pod \"kube-apiserver-operator-766d6c64bb-qvs5j\" (UID: \"277ec781-2698-4637-a619-e9bdc3fcabae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qvs5j" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.889622 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a-service-ca-bundle\") pod \"router-default-5444994796-h6t2z\" (UID: \"6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a\") " pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:53:44 crc kubenswrapper[4865]: E1205 05:53:44.901139 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:45.401125358 +0000 UTC m=+44.681136580 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.908040 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f4551408-dc5b-41f2-b83f-db54412a6d25-signing-cabundle\") pod \"service-ca-9c57cc56f-cl4kt\" (UID: \"f4551408-dc5b-41f2-b83f-db54412a6d25\") " pod="openshift-service-ca/service-ca-9c57cc56f-cl4kt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.910861 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7e31d52b-ab95-489d-bbff-34e4b0daf602-images\") pod \"machine-config-operator-74547568cd-zbsvt\" (UID: \"7e31d52b-ab95-489d-bbff-34e4b0daf602\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.914333 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5dbbc361-a522-4548-a14e-bdd061c7bc4b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.915076 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51c9c322-fb54-438d-a353-d8deaa07b4fb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kxq8c\" (UID: \"51c9c322-fb54-438d-a353-d8deaa07b4fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kxq8c" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.915139 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d60e1629-83b6-4492-bd6c-c0ed90da02be-secret-volume\") pod \"collect-profiles-29415225-7pnfb\" (UID: \"d60e1629-83b6-4492-bd6c-c0ed90da02be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.916396 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a-stats-auth\") pod \"router-default-5444994796-h6t2z\" (UID: \"6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a\") " pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.917122 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7e31d52b-ab95-489d-bbff-34e4b0daf602-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zbsvt\" (UID: \"7e31d52b-ab95-489d-bbff-34e4b0daf602\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.919431 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/277ec781-2698-4637-a619-e9bdc3fcabae-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qvs5j\" (UID: \"277ec781-2698-4637-a619-e9bdc3fcabae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qvs5j" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.919744 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed63d1de-99a4-4b79-9aee-49a5484d968f-webhook-cert\") pod \"packageserver-d55dfcdfc-7jm2h\" (UID: \"ed63d1de-99a4-4b79-9aee-49a5484d968f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.923187 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8-proxy-tls\") pod \"machine-config-controller-84d6567774-l4sds\" (UID: \"26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l4sds" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.923945 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a-default-certificate\") pod \"router-default-5444994796-h6t2z\" (UID: \"6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a\") " pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.924427 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d4c0391a-598b-4504-9601-c1b362c3060c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-drrtb\" (UID: \"d4c0391a-598b-4504-9601-c1b362c3060c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-drrtb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.924907 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e31d52b-ab95-489d-bbff-34e4b0daf602-proxy-tls\") pod \"machine-config-operator-74547568cd-zbsvt\" (UID: \"7e31d52b-ab95-489d-bbff-34e4b0daf602\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.926514 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a-metrics-certs\") pod \"router-default-5444994796-h6t2z\" (UID: \"6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a\") " pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.937626 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/397265e2-4c59-42f5-8774-357048ba57ac-metrics-certs\") pod \"network-metrics-daemon-j8p6s\" (UID: \"397265e2-4c59-42f5-8774-357048ba57ac\") " pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.938471 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f4551408-dc5b-41f2-b83f-db54412a6d25-signing-key\") pod \"service-ca-9c57cc56f-cl4kt\" (UID: \"f4551408-dc5b-41f2-b83f-db54412a6d25\") " pod="openshift-service-ca/service-ca-9c57cc56f-cl4kt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.939283 4865 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dff0db39-9f6f-4455-8cab-8d4cdce33b04-console-serving-cert\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.939699 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-service-ca\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.941737 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl2fd\" (UniqueName: \"kubernetes.io/projected/26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8-kube-api-access-cl2fd\") pod \"machine-config-controller-84d6567774-l4sds\" (UID: \"26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l4sds" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.942735 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx7fs\" (UniqueName: \"kubernetes.io/projected/6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a-kube-api-access-vx7fs\") pod \"router-default-5444994796-h6t2z\" (UID: \"6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a\") " pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.943306 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l4sds" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.944235 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqgcm\" (UniqueName: \"kubernetes.io/projected/910872c7-f875-4697-9a0a-3ae986b9fca7-kube-api-access-mqgcm\") pod \"migrator-59844c95c7-zjllp\" (UID: \"910872c7-f875-4697-9a0a-3ae986b9fca7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zjllp" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.944584 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/277ec781-2698-4637-a619-e9bdc3fcabae-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qvs5j\" (UID: \"277ec781-2698-4637-a619-e9bdc3fcabae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qvs5j" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.944677 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg2vv\" (UniqueName: \"kubernetes.io/projected/d4c0391a-598b-4504-9601-c1b362c3060c-kube-api-access-xg2vv\") pod \"multus-admission-controller-857f4d67dd-drrtb\" (UID: \"d4c0391a-598b-4504-9601-c1b362c3060c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-drrtb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.946929 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed63d1de-99a4-4b79-9aee-49a5484d968f-apiservice-cert\") pod \"packageserver-d55dfcdfc-7jm2h\" (UID: \"ed63d1de-99a4-4b79-9aee-49a5484d968f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.947693 4865 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c0cebc10-c0ad-419c-903c-341c516f1527-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-v2jhh\" (UID: \"c0cebc10-c0ad-419c-903c-341c516f1527\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v2jhh" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.948086 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgwgk\" (UniqueName: \"kubernetes.io/projected/dff0db39-9f6f-4455-8cab-8d4cdce33b04-kube-api-access-bgwgk\") pod \"console-f9d7485db-lclsv\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.948163 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d871e232-b94a-4c90-b2c1-775f93eaa51e-metrics-tls\") pod \"dns-operator-744455d44c-vqhhk\" (UID: \"d871e232-b94a-4c90-b2c1-775f93eaa51e\") " pod="openshift-dns-operator/dns-operator-744455d44c-vqhhk" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.950009 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wqh5\" (UniqueName: \"kubernetes.io/projected/d60e1629-83b6-4492-bd6c-c0ed90da02be-kube-api-access-7wqh5\") pod \"collect-profiles-29415225-7pnfb\" (UID: \"d60e1629-83b6-4492-bd6c-c0ed90da02be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.950373 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9kkg\" (UniqueName: \"kubernetes.io/projected/51c9c322-fb54-438d-a353-d8deaa07b4fb-kube-api-access-g9kkg\") pod \"kube-storage-version-migrator-operator-b67b599dd-kxq8c\" (UID: \"51c9c322-fb54-438d-a353-d8deaa07b4fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kxq8c" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.951595 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5dbbc361-a522-4548-a14e-bdd061c7bc4b-bound-sa-token\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.953139 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr6qv\" (UniqueName: \"kubernetes.io/projected/f4551408-dc5b-41f2-b83f-db54412a6d25-kube-api-access-mr6qv\") pod \"service-ca-9c57cc56f-cl4kt\" (UID: \"f4551408-dc5b-41f2-b83f-db54412a6d25\") " pod="openshift-service-ca/service-ca-9c57cc56f-cl4kt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.953531 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qhmr\" (UniqueName: \"kubernetes.io/projected/d871e232-b94a-4c90-b2c1-775f93eaa51e-kube-api-access-8qhmr\") pod \"dns-operator-744455d44c-vqhhk\" (UID: \"d871e232-b94a-4c90-b2c1-775f93eaa51e\") " pod="openshift-dns-operator/dns-operator-744455d44c-vqhhk" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.954749 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzc2c\" (UniqueName: \"kubernetes.io/projected/55de6799-e76b-4493-a007-49cd203e7573-kube-api-access-xzc2c\") pod 
\"downloads-7954f5f757-d6z25\" (UID: \"55de6799-e76b-4493-a007-49cd203e7573\") " pod="openshift-console/downloads-7954f5f757-d6z25" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.955785 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqw68\" (UniqueName: \"kubernetes.io/projected/5dbbc361-a522-4548-a14e-bdd061c7bc4b-kube-api-access-qqw68\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.973294 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbf6l\" (UniqueName: \"kubernetes.io/projected/7e31d52b-ab95-489d-bbff-34e4b0daf602-kube-api-access-tbf6l\") pod \"machine-config-operator-74547568cd-zbsvt\" (UID: \"7e31d52b-ab95-489d-bbff-34e4b0daf602\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.973650 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-drrtb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.978448 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv4sn\" (UniqueName: \"kubernetes.io/projected/ed63d1de-99a4-4b79-9aee-49a5484d968f-kube-api-access-lv4sn\") pod \"packageserver-d55dfcdfc-7jm2h\" (UID: \"ed63d1de-99a4-4b79-9aee-49a5484d968f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.978627 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-cl4kt" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979362 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979512 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9813d882-ad86-4946-96ae-caa85c66aaab-srv-cert\") pod \"olm-operator-6b444d44fb-h4jvf\" (UID: \"9813d882-ad86-4946-96ae-caa85c66aaab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979541 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tt8j\" (UniqueName: \"kubernetes.io/projected/45637469-1150-4175-a642-a33eeb1c7a9d-kube-api-access-8tt8j\") pod \"service-ca-operator-777779d784-hgrgd\" (UID: \"45637469-1150-4175-a642-a33eeb1c7a9d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hgrgd" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979563 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2c86fc9-df64-415a-bfbe-8a6049dc4d55-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p8dtw\" (UID: \"d2c86fc9-df64-415a-bfbe-8a6049dc4d55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8dtw" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 
05:53:44.979584 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5-metrics-tls\") pod \"dns-default-ps4tl\" (UID: \"0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5\") " pod="openshift-dns/dns-default-ps4tl" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979605 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1f49a368-065d-4057-a044-a019eba9ce9e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8thws\" (UID: \"1f49a368-065d-4057-a044-a019eba9ce9e\") " pod="openshift-marketplace/marketplace-operator-79b997595-8thws" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979628 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph2jj\" (UniqueName: \"kubernetes.io/projected/1f49a368-065d-4057-a044-a019eba9ce9e-kube-api-access-ph2jj\") pod \"marketplace-operator-79b997595-8thws\" (UID: \"1f49a368-065d-4057-a044-a019eba9ce9e\") " pod="openshift-marketplace/marketplace-operator-79b997595-8thws" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979664 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkssp\" (UniqueName: \"kubernetes.io/projected/ab8b1868-5f2d-4c43-a63c-525760727b75-kube-api-access-kkssp\") pod \"package-server-manager-789f6589d5-whlbb\" (UID: \"ab8b1868-5f2d-4c43-a63c-525760727b75\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whlbb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979692 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5-config-volume\") pod \"dns-default-ps4tl\" (UID: \"0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5\") " pod="openshift-dns/dns-default-ps4tl" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979723 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2gtr\" (UniqueName: \"kubernetes.io/projected/5293d191-528f-4818-b897-11bb456c2b50-kube-api-access-v2gtr\") pod \"control-plane-machine-set-operator-78cbb6b69f-fmdqp\" (UID: \"5293d191-528f-4818-b897-11bb456c2b50\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fmdqp" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979742 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/63297c46-fc31-471d-99dc-47352f52e76c-mountpoint-dir\") pod \"csi-hostpathplugin-wsxc7\" (UID: \"63297c46-fc31-471d-99dc-47352f52e76c\") " pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979800 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2c86fc9-df64-415a-bfbe-8a6049dc4d55-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p8dtw\" (UID: \"d2c86fc9-df64-415a-bfbe-8a6049dc4d55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8dtw" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979819 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h9z2\" (UniqueName: 
\"kubernetes.io/projected/c3633127-3192-43e8-87a2-5049b2d82fa6-kube-api-access-9h9z2\") pod \"cni-sysctl-allowlist-ds-k9x8b\" (UID: \"c3633127-3192-43e8-87a2-5049b2d82fa6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979858 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgfz2\" (UniqueName: \"kubernetes.io/projected/9813d882-ad86-4946-96ae-caa85c66aaab-kube-api-access-qgfz2\") pod \"olm-operator-6b444d44fb-h4jvf\" (UID: \"9813d882-ad86-4946-96ae-caa85c66aaab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979876 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d930639e-351d-4089-88b9-966335507daf-node-bootstrap-token\") pod \"machine-config-server-26v4f\" (UID: \"d930639e-351d-4089-88b9-966335507daf\") " pod="openshift-machine-config-operator/machine-config-server-26v4f" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979896 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgqzz\" (UniqueName: \"kubernetes.io/projected/63297c46-fc31-471d-99dc-47352f52e76c-kube-api-access-bgqzz\") pod \"csi-hostpathplugin-wsxc7\" (UID: \"63297c46-fc31-471d-99dc-47352f52e76c\") " pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979913 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/63297c46-fc31-471d-99dc-47352f52e76c-plugins-dir\") pod \"csi-hostpathplugin-wsxc7\" (UID: \"63297c46-fc31-471d-99dc-47352f52e76c\") " pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979930 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3633127-3192-43e8-87a2-5049b2d82fa6-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-k9x8b\" (UID: \"c3633127-3192-43e8-87a2-5049b2d82fa6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979950 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab8b1868-5f2d-4c43-a63c-525760727b75-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-whlbb\" (UID: \"ab8b1868-5f2d-4c43-a63c-525760727b75\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whlbb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979969 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjh4d\" (UniqueName: \"kubernetes.io/projected/d930639e-351d-4089-88b9-966335507daf-kube-api-access-tjh4d\") pod \"machine-config-server-26v4f\" (UID: \"d930639e-351d-4089-88b9-966335507daf\") " pod="openshift-machine-config-operator/machine-config-server-26v4f" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.979993 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45637469-1150-4175-a642-a33eeb1c7a9d-serving-cert\") pod \"service-ca-operator-777779d784-hgrgd\" (UID: \"45637469-1150-4175-a642-a33eeb1c7a9d\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-hgrgd" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.980016 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/c3633127-3192-43e8-87a2-5049b2d82fa6-ready\") pod \"cni-sysctl-allowlist-ds-k9x8b\" (UID: \"c3633127-3192-43e8-87a2-5049b2d82fa6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.980035 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c3633127-3192-43e8-87a2-5049b2d82fa6-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-k9x8b\" (UID: \"c3633127-3192-43e8-87a2-5049b2d82fa6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.980052 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9813d882-ad86-4946-96ae-caa85c66aaab-profile-collector-cert\") pod \"olm-operator-6b444d44fb-h4jvf\" (UID: \"9813d882-ad86-4946-96ae-caa85c66aaab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.980070 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/63297c46-fc31-471d-99dc-47352f52e76c-registration-dir\") pod \"csi-hostpathplugin-wsxc7\" (UID: \"63297c46-fc31-471d-99dc-47352f52e76c\") " pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.980085 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f49a368-065d-4057-a044-a019eba9ce9e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8thws\" (UID: \"1f49a368-065d-4057-a044-a019eba9ce9e\") " pod="openshift-marketplace/marketplace-operator-79b997595-8thws" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.980101 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc67cfa4-a4e5-48c4-9f1e-fa56ac7ab17c-cert\") pod \"ingress-canary-7lcjb\" (UID: \"cc67cfa4-a4e5-48c4-9f1e-fa56ac7ab17c\") " pod="openshift-ingress-canary/ingress-canary-7lcjb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.980115 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2c86fc9-df64-415a-bfbe-8a6049dc4d55-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p8dtw\" (UID: \"d2c86fc9-df64-415a-bfbe-8a6049dc4d55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8dtw" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.980133 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/63297c46-fc31-471d-99dc-47352f52e76c-csi-data-dir\") pod \"csi-hostpathplugin-wsxc7\" (UID: \"63297c46-fc31-471d-99dc-47352f52e76c\") " pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.980147 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/d930639e-351d-4089-88b9-966335507daf-certs\") pod \"machine-config-server-26v4f\" (UID: \"d930639e-351d-4089-88b9-966335507daf\") " pod="openshift-machine-config-operator/machine-config-server-26v4f" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.980163 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45637469-1150-4175-a642-a33eeb1c7a9d-config\") pod \"service-ca-operator-777779d784-hgrgd\" (UID: \"45637469-1150-4175-a642-a33eeb1c7a9d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hgrgd" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.980160 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/63297c46-fc31-471d-99dc-47352f52e76c-mountpoint-dir\") pod \"csi-hostpathplugin-wsxc7\" (UID: \"63297c46-fc31-471d-99dc-47352f52e76c\") " pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.980184 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8f99\" (UniqueName: \"kubernetes.io/projected/cc67cfa4-a4e5-48c4-9f1e-fa56ac7ab17c-kube-api-access-l8f99\") pod \"ingress-canary-7lcjb\" (UID: \"cc67cfa4-a4e5-48c4-9f1e-fa56ac7ab17c\") " pod="openshift-ingress-canary/ingress-canary-7lcjb" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.980200 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5sxd\" (UniqueName: \"kubernetes.io/projected/0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5-kube-api-access-f5sxd\") pod \"dns-default-ps4tl\" (UID: \"0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5\") " pod="openshift-dns/dns-default-ps4tl" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.980217 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/63297c46-fc31-471d-99dc-47352f52e76c-socket-dir\") pod \"csi-hostpathplugin-wsxc7\" (UID: \"63297c46-fc31-471d-99dc-47352f52e76c\") " pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.980237 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5293d191-528f-4818-b897-11bb456c2b50-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fmdqp\" (UID: \"5293d191-528f-4818-b897-11bb456c2b50\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fmdqp" Dec 05 05:53:44 crc kubenswrapper[4865]: E1205 05:53:44.980626 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:45.480606661 +0000 UTC m=+44.760617883 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.981424 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c3633127-3192-43e8-87a2-5049b2d82fa6-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-k9x8b\" (UID: \"c3633127-3192-43e8-87a2-5049b2d82fa6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.982394 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2c86fc9-df64-415a-bfbe-8a6049dc4d55-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p8dtw\" (UID: \"d2c86fc9-df64-415a-bfbe-8a6049dc4d55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8dtw" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.982566 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/63297c46-fc31-471d-99dc-47352f52e76c-csi-data-dir\") pod \"csi-hostpathplugin-wsxc7\" (UID: \"63297c46-fc31-471d-99dc-47352f52e76c\") " pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.983803 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/63297c46-fc31-471d-99dc-47352f52e76c-plugins-dir\") pod \"csi-hostpathplugin-wsxc7\" (UID: \"63297c46-fc31-471d-99dc-47352f52e76c\") " pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.984154 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3633127-3192-43e8-87a2-5049b2d82fa6-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-k9x8b\" (UID: \"c3633127-3192-43e8-87a2-5049b2d82fa6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.984428 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5-config-volume\") pod \"dns-default-ps4tl\" (UID: \"0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5\") " pod="openshift-dns/dns-default-ps4tl" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.984906 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/c3633127-3192-43e8-87a2-5049b2d82fa6-ready\") pod \"cni-sysctl-allowlist-ds-k9x8b\" (UID: \"c3633127-3192-43e8-87a2-5049b2d82fa6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.991184 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdrmq\" (UniqueName: \"kubernetes.io/projected/c0cebc10-c0ad-419c-903c-341c516f1527-kube-api-access-sdrmq\") pod \"machine-api-operator-5694c8668f-v2jhh\" (UID: \"c0cebc10-c0ad-419c-903c-341c516f1527\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-v2jhh" Dec 05 05:53:44 crc kubenswrapper[4865]: I1205 05:53:44.991523 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f49a368-065d-4057-a044-a019eba9ce9e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8thws\" (UID: \"1f49a368-065d-4057-a044-a019eba9ce9e\") " pod="openshift-marketplace/marketplace-operator-79b997595-8thws" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.003167 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/63297c46-fc31-471d-99dc-47352f52e76c-registration-dir\") pod \"csi-hostpathplugin-wsxc7\" (UID: \"63297c46-fc31-471d-99dc-47352f52e76c\") " pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.008371 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/63297c46-fc31-471d-99dc-47352f52e76c-socket-dir\") pod \"csi-hostpathplugin-wsxc7\" (UID: \"63297c46-fc31-471d-99dc-47352f52e76c\") " pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.008777 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1f49a368-065d-4057-a044-a019eba9ce9e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8thws\" (UID: \"1f49a368-065d-4057-a044-a019eba9ce9e\") " pod="openshift-marketplace/marketplace-operator-79b997595-8thws" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.015422 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d930639e-351d-4089-88b9-966335507daf-node-bootstrap-token\") pod \"machine-config-server-26v4f\" (UID: \"d930639e-351d-4089-88b9-966335507daf\") " pod="openshift-machine-config-operator/machine-config-server-26v4f" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.015859 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5-metrics-tls\") pod \"dns-default-ps4tl\" (UID: \"0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5\") " pod="openshift-dns/dns-default-ps4tl" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.017045 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hrb2j"] Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.017524 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2c86fc9-df64-415a-bfbe-8a6049dc4d55-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p8dtw\" (UID: \"d2c86fc9-df64-415a-bfbe-8a6049dc4d55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8dtw" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.018042 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9813d882-ad86-4946-96ae-caa85c66aaab-profile-collector-cert\") pod \"olm-operator-6b444d44fb-h4jvf\" (UID: \"9813d882-ad86-4946-96ae-caa85c66aaab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf" Dec 05 05:53:45 crc 
kubenswrapper[4865]: I1205 05:53:45.026745 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h9z2\" (UniqueName: \"kubernetes.io/projected/c3633127-3192-43e8-87a2-5049b2d82fa6-kube-api-access-9h9z2\") pod \"cni-sysctl-allowlist-ds-k9x8b\" (UID: \"c3633127-3192-43e8-87a2-5049b2d82fa6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.027108 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5293d191-528f-4818-b897-11bb456c2b50-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fmdqp\" (UID: \"5293d191-528f-4818-b897-11bb456c2b50\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fmdqp" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.027311 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45637469-1150-4175-a642-a33eeb1c7a9d-config\") pod \"service-ca-operator-777779d784-hgrgd\" (UID: \"45637469-1150-4175-a642-a33eeb1c7a9d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hgrgd" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.027718 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab8b1868-5f2d-4c43-a63c-525760727b75-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-whlbb\" (UID: \"ab8b1868-5f2d-4c43-a63c-525760727b75\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whlbb" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.028441 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9813d882-ad86-4946-96ae-caa85c66aaab-srv-cert\") pod \"olm-operator-6b444d44fb-h4jvf\" (UID: \"9813d882-ad86-4946-96ae-caa85c66aaab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.029956 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45637469-1150-4175-a642-a33eeb1c7a9d-serving-cert\") pod \"service-ca-operator-777779d784-hgrgd\" (UID: \"45637469-1150-4175-a642-a33eeb1c7a9d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hgrgd" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.030340 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8p6s" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.033421 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph2jj\" (UniqueName: \"kubernetes.io/projected/1f49a368-065d-4057-a044-a019eba9ce9e-kube-api-access-ph2jj\") pod \"marketplace-operator-79b997595-8thws\" (UID: \"1f49a368-065d-4057-a044-a019eba9ce9e\") " pod="openshift-marketplace/marketplace-operator-79b997595-8thws" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.033946 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjh4d\" (UniqueName: \"kubernetes.io/projected/d930639e-351d-4089-88b9-966335507daf-kube-api-access-tjh4d\") pod \"machine-config-server-26v4f\" (UID: \"d930639e-351d-4089-88b9-966335507daf\") " pod="openshift-machine-config-operator/machine-config-server-26v4f" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.045421 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc67cfa4-a4e5-48c4-9f1e-fa56ac7ab17c-cert\") pod \"ingress-canary-7lcjb\" (UID: \"cc67cfa4-a4e5-48c4-9f1e-fa56ac7ab17c\") " pod="openshift-ingress-canary/ingress-canary-7lcjb" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.056934 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2gtr\" (UniqueName: \"kubernetes.io/projected/5293d191-528f-4818-b897-11bb456c2b50-kube-api-access-v2gtr\") pod \"control-plane-machine-set-operator-78cbb6b69f-fmdqp\" (UID: \"5293d191-528f-4818-b897-11bb456c2b50\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fmdqp" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.057758 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkssp\" (UniqueName: \"kubernetes.io/projected/ab8b1868-5f2d-4c43-a63c-525760727b75-kube-api-access-kkssp\") pod \"package-server-manager-789f6589d5-whlbb\" (UID: \"ab8b1868-5f2d-4c43-a63c-525760727b75\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whlbb" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.058573 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8f99\" (UniqueName: \"kubernetes.io/projected/cc67cfa4-a4e5-48c4-9f1e-fa56ac7ab17c-kube-api-access-l8f99\") pod \"ingress-canary-7lcjb\" (UID: \"cc67cfa4-a4e5-48c4-9f1e-fa56ac7ab17c\") " pod="openshift-ingress-canary/ingress-canary-7lcjb" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.059358 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgfz2\" (UniqueName: \"kubernetes.io/projected/9813d882-ad86-4946-96ae-caa85c66aaab-kube-api-access-qgfz2\") pod \"olm-operator-6b444d44fb-h4jvf\" (UID: \"9813d882-ad86-4946-96ae-caa85c66aaab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.059730 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2c86fc9-df64-415a-bfbe-8a6049dc4d55-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p8dtw\" (UID: \"d2c86fc9-df64-415a-bfbe-8a6049dc4d55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8dtw" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.063362 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.063625 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tt8j\" (UniqueName: \"kubernetes.io/projected/45637469-1150-4175-a642-a33eeb1c7a9d-kube-api-access-8tt8j\") pod \"service-ca-operator-777779d784-hgrgd\" (UID: \"45637469-1150-4175-a642-a33eeb1c7a9d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hgrgd" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.070561 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5sxd\" (UniqueName: \"kubernetes.io/projected/0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5-kube-api-access-f5sxd\") pod \"dns-default-ps4tl\" (UID: \"0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5\") " pod="openshift-dns/dns-default-ps4tl" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.076330 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d930639e-351d-4089-88b9-966335507daf-certs\") pod \"machine-config-server-26v4f\" (UID: \"d930639e-351d-4089-88b9-966335507daf\") " pod="openshift-machine-config-operator/machine-config-server-26v4f" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.081356 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgqzz\" (UniqueName: \"kubernetes.io/projected/63297c46-fc31-471d-99dc-47352f52e76c-kube-api-access-bgqzz\") pod \"csi-hostpathplugin-wsxc7\" (UID: \"63297c46-fc31-471d-99dc-47352f52e76c\") " pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.082591 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:45 crc kubenswrapper[4865]: E1205 05:53:45.083146 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:45.583133479 +0000 UTC m=+44.863144691 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.094210 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.098581 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.104253 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-v2jhh" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.114487 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-d6z25" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.160201 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vqhhk" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.176331 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.183745 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:45 crc kubenswrapper[4865]: E1205 05:53:45.184225 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:45.684204577 +0000 UTC m=+44.964215799 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.193062 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qvs5j" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.214466 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.221482 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zjllp" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.236950 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kxq8c" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.243296 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.255236 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.283021 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-82xnx"] Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.286491 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.287153 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:45 crc kubenswrapper[4865]: E1205 05:53:45.287513 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:45.787499467 +0000 UTC m=+45.067510689 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.295271 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fmdqp" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.303120 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8dtw" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.315781 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.327962 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hgrgd" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.328409 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whlbb" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.347107 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7lcjb" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.354134 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ps4tl" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.368636 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-26v4f" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.377751 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r"] Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.388939 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:45 crc kubenswrapper[4865]: E1205 05:53:45.389314 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:45.889298646 +0000 UTC m=+45.169309868 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.428776 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq"] Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.434113 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr"] Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.468903 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v5wv7"] Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.490104 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:45 crc kubenswrapper[4865]: E1205 05:53:45.490393 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:45.990382303 +0000 UTC m=+45.270393525 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.491447 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z77ns" podStartSLOduration=20.491433743 podStartE2EDuration="20.491433743s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:45.489382935 +0000 UTC m=+44.769394157" watchObservedRunningTime="2025-12-05 05:53:45.491433743 +0000 UTC m=+44.771444965" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.590942 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:45 crc kubenswrapper[4865]: E1205 05:53:45.591307 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:46.091254315 +0000 UTC m=+45.371265537 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.648796 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts"] Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.693481 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:45 crc kubenswrapper[4865]: E1205 05:53:45.693918 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:46.193903068 +0000 UTC m=+45.473914290 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.722746 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-cjczz"] Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.727021 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-l4sds"] Dec 05 05:53:45 crc kubenswrapper[4865]: W1205 05:53:45.762574 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a712fd8_b8cb_4ac2_955d_b19e4cd36c7a.slice/crio-ccae23c2151b63a4ef40daefcfe51354aaf261169688edfd897f06b7f0a130a5 WatchSource:0}: Error finding container ccae23c2151b63a4ef40daefcfe51354aaf261169688edfd897f06b7f0a130a5: Status 404 returned error can't find the container with id ccae23c2151b63a4ef40daefcfe51354aaf261169688edfd897f06b7f0a130a5 Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.791299 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" event={"ID":"8cc2c35c-7bb7-4475-a318-0133139b9359","Type":"ContainerStarted","Data":"8a3eb92fc43cdedd2eae21b064b17ef9011708e3c915217dd10888e11a0912ed"} Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.794432 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:45 crc kubenswrapper[4865]: E1205 05:53:45.794953 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:46.294935544 +0000 UTC m=+45.574946766 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.799221 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" event={"ID":"5f6c0508-ee1a-4224-bdc6-f6f7af78d75a","Type":"ContainerStarted","Data":"ac26de4fcc45956785bcc6d6706b67d4f77ba8160e726848e5899d9b08871dd1"} Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.799254 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" event={"ID":"5f6c0508-ee1a-4224-bdc6-f6f7af78d75a","Type":"ContainerStarted","Data":"065db5994e803fea03bc2b8c62fd5d32dddaf961e555a585a3bb793097df1b94"} Dec 05 05:53:45 crc kubenswrapper[4865]: W1205 05:53:45.807186 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26c8d7ff_d655_4d40_b99c_e11daec5b263.slice/crio-851e4ca31d2999ab5ac8b506c7ecad26c3558c3dbf09d919c6909dcbf5e95009 WatchSource:0}: Error finding container 851e4ca31d2999ab5ac8b506c7ecad26c3558c3dbf09d919c6909dcbf5e95009: Status 404 returned error can't find the container with id 851e4ca31d2999ab5ac8b506c7ecad26c3558c3dbf09d919c6909dcbf5e95009 Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.828345 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-cl4kt"] Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.832435 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbmfc" event={"ID":"d3782972-9e30-42ae-9af7-9cba4bdcbfe3","Type":"ContainerStarted","Data":"f1fcadbd649ad167ebae1d59d3d28280eab2f959f8b7d3b24d644090e4a5a353"} Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.832509 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbmfc" event={"ID":"d3782972-9e30-42ae-9af7-9cba4bdcbfe3","Type":"ContainerStarted","Data":"77b66b1406b8275befce0c5b2cfe1d9d5f067210c9a0ae6d427da5ccf844e66b"} Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.834661 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r" event={"ID":"d5a2cf4e-5936-433e-ae66-9906820ebe95","Type":"ContainerStarted","Data":"953664dbd1402aa117441059f6d2c6178cbe040cc8a0c5648f6181c2d3b666db"} Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.849536 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v5wv7" event={"ID":"cc594832-ffe0-4764-95ac-f62739f0314e","Type":"ContainerStarted","Data":"7bf29d6e8626012ab7369fff203100cbe7523317c7c329f7045b48a7bf5c31dd"} Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.866387 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rp2kv" 
event={"ID":"445165fe-92bf-4fb9-9f6e-4f8101278621","Type":"ContainerStarted","Data":"81902f3386e7d0d8d39770cedb589577a569acc3b6cdcdecf3e3c069488dffb3"} Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.866921 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-xhxgz" podStartSLOduration=20.866905563 podStartE2EDuration="20.866905563s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:45.862511568 +0000 UTC m=+45.142522810" watchObservedRunningTime="2025-12-05 05:53:45.866905563 +0000 UTC m=+45.146916785" Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.891548 4865 generic.go:334] "Generic (PLEG): container finished" podID="320fa4b0-5e00-4bca-b8f2-1afa7387d156" containerID="3d22f94563a233e7f8888a5f0e0ec08f7924ce28d54ab1595f81c57c872df133" exitCode=0 Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.891634 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" event={"ID":"320fa4b0-5e00-4bca-b8f2-1afa7387d156","Type":"ContainerDied","Data":"3d22f94563a233e7f8888a5f0e0ec08f7924ce28d54ab1595f81c57c872df133"} Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.896812 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:45 crc kubenswrapper[4865]: E1205 05:53:45.898612 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:46.398599805 +0000 UTC m=+45.678611027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.912682 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-h6t2z" event={"ID":"6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a","Type":"ContainerStarted","Data":"ccae23c2151b63a4ef40daefcfe51354aaf261169688edfd897f06b7f0a130a5"} Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.927691 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z77ns" event={"ID":"dabab9ed-be46-4cbf-a0e5-8e3679b3b434","Type":"ContainerStarted","Data":"338e86c1166ca284e27209069cd94c124a12594c3ae54889f92c17c5f4188a46"} Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.946241 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" event={"ID":"87a82cae-057c-47d5-9703-eb48128e1bd9","Type":"ContainerStarted","Data":"ae12a8c637b786de5c1a85a635492bef0131458baa06c17017a905ebce3f328f"} Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.997262 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" event={"ID":"4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00","Type":"ContainerStarted","Data":"1aee1230e159b550d6d549ccc531241596efa5db31ddf1d231f925fcb679d5a1"} Dec 05 05:53:45 crc kubenswrapper[4865]: I1205 05:53:45.998318 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:45 crc kubenswrapper[4865]: E1205 05:53:45.999605 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:46.499387014 +0000 UTC m=+45.779398236 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.053645 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt"] Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.087002 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr" event={"ID":"3f657509-34dd-4ea9-84fb-ce4548fde2f3","Type":"ContainerStarted","Data":"81e41942b66c345debfe29b61e25de2a9f388894426eac7d61b09ae5339ac701"} Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.104386 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:46 crc kubenswrapper[4865]: E1205 05:53:46.104716 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:46.604702832 +0000 UTC m=+45.884714054 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.119964 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.137501 4865 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-k5gw9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.137547 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" podUID="4ff00812-1a0c-4bbc-8222-d7765505af6b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.137839 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-95g8c" event={"ID":"60c9fd82-77d0-4c86-9ff6-0489fbeab324","Type":"ContainerStarted","Data":"b536216b417e7eea7f9011c3b1f6c546c14777d35d896222b16f131267d39d84"} Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.138893 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-95g8c" Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.143542 4865 patch_prober.go:28] interesting pod/console-operator-58897d9998-95g8c container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.143594 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-95g8c" podUID="60c9fd82-77d0-4c86-9ff6-0489fbeab324" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.176483 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" podStartSLOduration=21.176468055 podStartE2EDuration="21.176468055s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:46.175209969 +0000 UTC m=+45.455221191" watchObservedRunningTime="2025-12-05 05:53:46.176468055 +0000 UTC m=+45.456479277" Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.177643 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j8p6s"] Dec 05 05:53:46 crc 
kubenswrapper[4865]: I1205 05:53:46.180154 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5vtbs" event={"ID":"bae52c2b-77f5-4196-804b-e72254ce6f24","Type":"ContainerStarted","Data":"0b72a629d564f40b8dd0b2680e103c7925160e9dc7bd0de55dc9b955c3f0334e"} Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.205780 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.210188 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-95g8c" podStartSLOduration=21.210172105 podStartE2EDuration="21.210172105s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:46.207521619 +0000 UTC m=+45.487532841" watchObservedRunningTime="2025-12-05 05:53:46.210172105 +0000 UTC m=+45.490183327" Dec 05 05:53:46 crc kubenswrapper[4865]: E1205 05:53:46.210572 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:46.710559216 +0000 UTC m=+45.990570518 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.234134 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5vtbs" podStartSLOduration=21.234113326 podStartE2EDuration="21.234113326s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:46.231757419 +0000 UTC m=+45.511768641" watchObservedRunningTime="2025-12-05 05:53:46.234113326 +0000 UTC m=+45.514124548" Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.309585 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:46 crc kubenswrapper[4865]: E1205 05:53:46.311497 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-05 05:53:46.811483449 +0000 UTC m=+46.091494671 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.412975 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:46 crc kubenswrapper[4865]: E1205 05:53:46.413372 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:46.913358403 +0000 UTC m=+46.193369625 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.521949 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:46 crc kubenswrapper[4865]: E1205 05:53:46.522261 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:47.022243532 +0000 UTC m=+46.302254754 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.623102 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:46 crc kubenswrapper[4865]: E1205 05:53:46.634399 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:47.124805733 +0000 UTC m=+46.404816945 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:46 crc kubenswrapper[4865]: W1205 05:53:46.674225 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod397265e2_4c59_42f5_8774_357048ba57ac.slice/crio-34e8f223893bdde6549d0bb3b4552fab397f9a410f8daa251066d7c9c14a9f88 WatchSource:0}: Error finding container 34e8f223893bdde6549d0bb3b4552fab397f9a410f8daa251066d7c9c14a9f88: Status 404 returned error can't find the container with id 34e8f223893bdde6549d0bb3b4552fab397f9a410f8daa251066d7c9c14a9f88 Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.726649 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:46 crc kubenswrapper[4865]: E1205 05:53:46.727037 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:47.227022935 +0000 UTC m=+46.507034157 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.751804 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h"] Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.828897 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:46 crc kubenswrapper[4865]: E1205 05:53:46.829168 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:47.329153293 +0000 UTC m=+46.609164515 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.891809 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lclsv"] Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.931215 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:46 crc kubenswrapper[4865]: E1205 05:53:46.931631 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:47.431619121 +0000 UTC m=+46.711630343 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:46 crc kubenswrapper[4865]: I1205 05:53:46.933413 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-v2jhh"] Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.039676 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:47 crc kubenswrapper[4865]: E1205 05:53:47.040014 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:47.539988686 +0000 UTC m=+46.819999898 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.040119 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:47 crc kubenswrapper[4865]: E1205 05:53:47.040376 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:47.540370166 +0000 UTC m=+46.820381388 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:47 crc kubenswrapper[4865]: W1205 05:53:47.067846 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0cebc10_c0ad_419c_903c_341c516f1527.slice/crio-58385acc07f25761d77892108cec6fe3b325830165d67163f5687e0ec63248b7 WatchSource:0}: Error finding container 58385acc07f25761d77892108cec6fe3b325830165d67163f5687e0ec63248b7: Status 404 returned error can't find the container with id 58385acc07f25761d77892108cec6fe3b325830165d67163f5687e0ec63248b7 Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.068661 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.101952 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.141111 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:47 crc kubenswrapper[4865]: E1205 05:53:47.141487 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:47.641471276 +0000 UTC m=+46.921482498 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.282770 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:47 crc kubenswrapper[4865]: E1205 05:53:47.283190 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:47.783178468 +0000 UTC m=+47.063189690 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.375161 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts" event={"ID":"26c8d7ff-d655-4d40-b99c-e11daec5b263","Type":"ContainerStarted","Data":"851e4ca31d2999ab5ac8b506c7ecad26c3558c3dbf09d919c6909dcbf5e95009"} Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.378479 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" event={"ID":"4ff00812-1a0c-4bbc-8222-d7765505af6b","Type":"ContainerStarted","Data":"9a3faeca9dc31aa6a02c4a07a9d9ae868dc05e51758895ac25b80a3ee2b6d5ca"} Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.390223 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:47 crc kubenswrapper[4865]: E1205 05:53:47.390610 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:47.890596556 +0000 UTC m=+47.170607778 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.390729 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r" event={"ID":"d5a2cf4e-5936-433e-ae66-9906820ebe95","Type":"ContainerStarted","Data":"de42271794a06ae00cb7aa4070731e3a86a6c0c51ca0079231bc44aa0b8e7f09"} Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.391164 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r" Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.399481 4865 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bg82r container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.399533 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r" podUID="d5a2cf4e-5936-433e-ae66-9906820ebe95" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.404729 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-cl4kt" event={"ID":"f4551408-dc5b-41f2-b83f-db54412a6d25","Type":"ContainerStarted","Data":"f3f2659d9fd2118ef090adf9c4749d7d0e36745fe862a2c25d1fd87f60737bf2"} Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.404786 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-cl4kt" event={"ID":"f4551408-dc5b-41f2-b83f-db54412a6d25","Type":"ContainerStarted","Data":"f7b4778910e324ce27d246a46cf95f135226b1b3057b4be03feed73fc520a134"} Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.426256 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-v2jhh" event={"ID":"c0cebc10-c0ad-419c-903c-341c516f1527","Type":"ContainerStarted","Data":"58385acc07f25761d77892108cec6fe3b325830165d67163f5687e0ec63248b7"} Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.438906 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j8p6s" event={"ID":"397265e2-4c59-42f5-8774-357048ba57ac","Type":"ContainerStarted","Data":"34e8f223893bdde6549d0bb3b4552fab397f9a410f8daa251066d7c9c14a9f88"} Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.441106 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.441094959 podStartE2EDuration="441.094959ms" podCreationTimestamp="2025-12-05 05:53:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:47.439113423 
+0000 UTC m=+46.719124645" watchObservedRunningTime="2025-12-05 05:53:47.441094959 +0000 UTC m=+46.721106181" Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.463010 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l4sds" event={"ID":"26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8","Type":"ContainerStarted","Data":"acac88242aa147ea496ff8b4f49e6d6305c12927fc7c785087a9d3b3b872aff4"} Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.467644 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r" podStartSLOduration=22.467622092 podStartE2EDuration="22.467622092s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:47.463751813 +0000 UTC m=+46.743763035" watchObservedRunningTime="2025-12-05 05:53:47.467622092 +0000 UTC m=+46.747633314" Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.480875 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt" event={"ID":"7e31d52b-ab95-489d-bbff-34e4b0daf602","Type":"ContainerStarted","Data":"5bf594468b1bc477c26e023cf0fa8a4855a4e3e0680fd6f7fb63203aaa8e9d07"} Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.492315 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:47 crc kubenswrapper[4865]: E1205 05:53:47.494410 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:47.994399102 +0000 UTC m=+47.274410324 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.502668 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr" event={"ID":"3f657509-34dd-4ea9-84fb-ce4548fde2f3","Type":"ContainerStarted","Data":"e2a3031bc3ac4a9db1b8c74635bdca471a5ec01295d38b07f5d6216a53e860fb"} Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.566743 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbmfc" event={"ID":"d3782972-9e30-42ae-9af7-9cba4bdcbfe3","Type":"ContainerStarted","Data":"409d791b06605d47a0d5e64dcd03e86a5ed272be07470a7c55a9d2e28118e816"} Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.573467 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-26v4f" event={"ID":"d930639e-351d-4089-88b9-966335507daf","Type":"ContainerStarted","Data":"d31342f4c47875ac0f8e4e21198c0e085f9838b92f19e95ebbfe297f1396e7fe"} Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.581204 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lclsv" event={"ID":"dff0db39-9f6f-4455-8cab-8d4cdce33b04","Type":"ContainerStarted","Data":"1568b97006d42371f19f0b190c8cdc35d7f16852dd89d4843c36dfc21fe5c6eb"} Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.584661 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-cl4kt" podStartSLOduration=22.584645073 podStartE2EDuration="22.584645073s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:47.511034134 +0000 UTC m=+46.791045356" watchObservedRunningTime="2025-12-05 05:53:47.584645073 +0000 UTC m=+46.864656295" Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.592905 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbmfc" podStartSLOduration=22.592879257 podStartE2EDuration="22.592879257s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:47.583696806 +0000 UTC m=+46.863708018" watchObservedRunningTime="2025-12-05 05:53:47.592879257 +0000 UTC m=+46.872890489" Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.593312 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wsxc7"] Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.593919 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
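The records above cycle through the same two failures roughly every 500 ms: MountVolume.MountDevice for volume pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 on pod image-registry-697d97f7c8-6fth8, and UnmountVolume.TearDown for the same volume on pod 8f668bae-612b-4b75-9490-919e737c6a3b. Both fail for the same underlying reason reported in each record, the kubevirt.io.hostpath-provisioner driver is not yet in the kubelet's list of registered CSI drivers, and nestedpendingoperations reschedules every attempt after a 500 ms durationBeforeRetry. A minimal sketch for collapsing such a journal into one summary line per failing operation and volume, assuming the log is available as a plain-text file; the kubelet.log file name and the regular expression are illustrative, not taken from the log itself:

#!/usr/bin/env python3
# Summarize recurring CSI volume-operation failures in a kubelet journal dump.
# Assumes a plain-text file ("kubelet.log" is an illustrative name) whose lines
# contain records in the format shown above.
import re
from collections import Counter

# Matches e.g.:
#   Error: MountVolume.MountDevice failed for volume "pvc-..." (...)
#   Error: UnmountVolume.TearDown failed for volume "pvc-..." (...)
FAILURE = re.compile(r'Error: (?P<op>\w+\.\w+) failed for volume "(?P<volume>[^"]+)"')

def summarize(path: str) -> Counter:
    counts: Counter = Counter()
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            # A single dumped line may hold several journal records, so scan all matches.
            for match in FAILURE.finditer(line):
                counts[(match.group("op"), match.group("volume"))] += 1
    return counts

if __name__ == "__main__":
    for (op, volume), n in summarize("kubelet.log").most_common():
        print(f"{n:4d}  {op}  {volume}")

Run over this section, a sketch like this reduces the repeated records to two counters, one per operation, both pointing at the same unregistered driver, which is usually enough to see that a single missing CSI registration is behind the whole burst.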
Dec 05 05:53:47 crc kubenswrapper[4865]: E1205 05:53:47.594344 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:48.094305078 +0000 UTC m=+47.374316300 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.618292 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rp2kv" event={"ID":"445165fe-92bf-4fb9-9f6e-4f8101278621","Type":"ContainerStarted","Data":"f2ed83ff9f0f176c50fce55d84607d8e12daef470b73a04c996595e222383a96"} Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.637405 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.646364 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" event={"ID":"ed63d1de-99a4-4b79-9aee-49a5484d968f","Type":"ContainerStarted","Data":"c6ad6ea25c507af9f7af7c7ddbafdb240397f1b4723fb20098b83cac6289d2b8"} Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.651437 4865 generic.go:334] "Generic (PLEG): container finished" podID="d02c7cde-9b06-4366-9ad0-b46d403446b1" containerID="e015d8761f97d73128321b921bca50f2ff06620d7490f94275bb4188a9ac57a5" exitCode=0 Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.651515 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" event={"ID":"d02c7cde-9b06-4366-9ad0-b46d403446b1","Type":"ContainerDied","Data":"e015d8761f97d73128321b921bca50f2ff06620d7490f94275bb4188a9ac57a5"} Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.655028 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rp2kv" podStartSLOduration=22.6550079 podStartE2EDuration="22.6550079s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:47.645411088 +0000 UTC m=+46.925422310" watchObservedRunningTime="2025-12-05 05:53:47.6550079 +0000 UTC m=+46.935019112" Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.662883 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" event={"ID":"8cc2c35c-7bb7-4475-a318-0133139b9359","Type":"ContainerStarted","Data":"ecc943f06edb698b6a5a2ee1dc0e642dcb6e04c57a75b36a7bae7dd6ff188786"} Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.663420 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.664669 4865 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" event={"ID":"dc34f331-e440-4ee3-98db-070d3115e2fd","Type":"ContainerStarted","Data":"4f4605cf05246471ef9d876ae3935dd507fbb6a2e5f5ba8c05de08b766265598"} Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.672245 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" event={"ID":"c3633127-3192-43e8-87a2-5049b2d82fa6","Type":"ContainerStarted","Data":"6a243abfb8d82cba56428be118baf7a49110fece54a2e7aedf3d587344667b9e"} Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.674067 4865 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-hrb2j container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.674114 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" podUID="8cc2c35c-7bb7-4475-a318-0133139b9359" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.674321 4865 patch_prober.go:28] interesting pod/console-operator-58897d9998-95g8c container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.674339 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-95g8c" podUID="60c9fd82-77d0-4c86-9ff6-0489fbeab324" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.699943 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:47 crc kubenswrapper[4865]: E1205 05:53:47.700623 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:48.200606404 +0000 UTC m=+47.480617627 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.769263 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" podStartSLOduration=22.769245993 podStartE2EDuration="22.769245993s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:47.76773564 +0000 UTC m=+47.047746862" watchObservedRunningTime="2025-12-05 05:53:47.769245993 +0000 UTC m=+47.049257215" Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.800790 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:47 crc kubenswrapper[4865]: E1205 05:53:47.802190 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:48.302173107 +0000 UTC m=+47.582184329 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:47 crc kubenswrapper[4865]: I1205 05:53:47.902462 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:47 crc kubenswrapper[4865]: E1205 05:53:47.902779 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:48.402768072 +0000 UTC m=+47.682779294 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.004775 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:48 crc kubenswrapper[4865]: E1205 05:53:48.004910 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:48.50489389 +0000 UTC m=+47.784905112 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.005015 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:48 crc kubenswrapper[4865]: E1205 05:53:48.005437 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:48.505380484 +0000 UTC m=+47.785391706 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.105754 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:48 crc kubenswrapper[4865]: E1205 05:53:48.106056 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:48.606040721 +0000 UTC m=+47.886051943 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.210215 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:48 crc kubenswrapper[4865]: E1205 05:53:48.224113 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:48.724091571 +0000 UTC m=+48.004102793 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.313288 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:48 crc kubenswrapper[4865]: E1205 05:53:48.313707 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:48.813688464 +0000 UTC m=+48.093699686 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.329012 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-drrtb"] Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.415042 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:48 crc kubenswrapper[4865]: E1205 05:53:48.415348 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:48.915335099 +0000 UTC m=+48.195346321 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.506137 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qvs5j"] Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.516208 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:48 crc kubenswrapper[4865]: E1205 05:53:48.516344 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:49.016312625 +0000 UTC m=+48.296323847 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.516534 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:48 crc kubenswrapper[4865]: E1205 05:53:48.516939 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:49.016930772 +0000 UTC m=+48.296941994 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.533858 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7lcjb"] Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.535563 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf"] Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.537174 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb"] Dec 05 05:53:48 crc kubenswrapper[4865]: W1205 05:53:48.589398 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod277ec781_2698_4637_a619_e9bdc3fcabae.slice/crio-c395e4244d933858d3620585b763965e610a421453b76acef7d681a075eceae0 WatchSource:0}: Error finding container c395e4244d933858d3620585b763965e610a421453b76acef7d681a075eceae0: Status 404 returned error can't find the container with id c395e4244d933858d3620585b763965e610a421453b76acef7d681a075eceae0 Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.618939 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:48 crc kubenswrapper[4865]: E1205 05:53:48.619134 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:49.119097982 +0000 UTC m=+48.399109204 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.619222 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:48 crc kubenswrapper[4865]: E1205 05:53:48.619490 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-05 05:53:49.119478773 +0000 UTC m=+48.399489995 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.628679 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zjllp"] Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.641168 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fmdqp"] Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.641208 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hgrgd"] Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.642791 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kxq8c"] Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.720593 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:48 crc kubenswrapper[4865]: E1205 05:53:48.720863 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:49.220798128 +0000 UTC m=+48.500809350 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.720929 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:48 crc kubenswrapper[4865]: E1205 05:53:48.721255 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:49.221240911 +0000 UTC m=+48.501252133 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.752832 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf" event={"ID":"9813d882-ad86-4946-96ae-caa85c66aaab","Type":"ContainerStarted","Data":"b8e6e7f8e8ec3d18706172fd0931d0417124239b84c4802efaa142304c8f3ded"} Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.767867 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whlbb"] Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.771372 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8thws"] Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.777936 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" event={"ID":"63297c46-fc31-471d-99dc-47352f52e76c","Type":"ContainerStarted","Data":"78fdc64c1db786800d4d3ee1d2195cf23ea8a8a4207830b7d208409bab23f841"} Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.779924 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vqhhk"] Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.786004 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb" event={"ID":"d60e1629-83b6-4492-bd6c-c0ed90da02be","Type":"ContainerStarted","Data":"bf6908eb5c505f14f8230eea34dfed02b8f8547650eb9bc2d2216e471543ebd7"} Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.789389 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-d6z25"] Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.793014 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7lcjb" event={"ID":"cc67cfa4-a4e5-48c4-9f1e-fa56ac7ab17c","Type":"ContainerStarted","Data":"8648861c94db8e76aa3afdb07f85455c104f9e8c251069581780b18556efb4c1"} Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.796420 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-26v4f" event={"ID":"d930639e-351d-4089-88b9-966335507daf","Type":"ContainerStarted","Data":"edeb599a0ac3de13a129b4102830619826d416286b33cdc8a57a7aa5dec30476"} Dec 05 05:53:48 crc kubenswrapper[4865]: W1205 05:53:48.796905 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45637469_1150_4175_a642_a33eeb1c7a9d.slice/crio-36e867086659e46a5785071c4f90414a3940f5f5c29ae6831a475683929c0ae2 WatchSource:0}: Error finding container 36e867086659e46a5785071c4f90414a3940f5f5c29ae6831a475683929c0ae2: Status 404 returned error can't find the container with id 36e867086659e46a5785071c4f90414a3940f5f5c29ae6831a475683929c0ae2 Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.798766 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-drrtb" event={"ID":"d4c0391a-598b-4504-9601-c1b362c3060c","Type":"ContainerStarted","Data":"0cc9a1ec3f250e2957d5580d5069d4fe5a9d10615734c4394d048fa2e4ec0fdd"} Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.799786 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8dtw"] Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.804117 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ps4tl"] Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.810641 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-h6t2z" event={"ID":"6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a","Type":"ContainerStarted","Data":"e08e3b53a9c6e7a470c32d33fb94a52a7a7fc6faa3391149414778b457fa3a3b"} Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.824599 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:48 crc kubenswrapper[4865]: E1205 05:53:48.825137 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:49.325117449 +0000 UTC m=+48.605128681 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.827743 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qvs5j" event={"ID":"277ec781-2698-4637-a619-e9bdc3fcabae","Type":"ContainerStarted","Data":"c395e4244d933858d3620585b763965e610a421453b76acef7d681a075eceae0"} Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.837185 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l4sds" event={"ID":"26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8","Type":"ContainerStarted","Data":"83cf6467522ece67b5cbe26e2cf3a254c6540340feb1d46cdeacbf8f36eb6316"} Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.843941 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-h6t2z" podStartSLOduration=23.843804539 podStartE2EDuration="23.843804539s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:48.843771578 +0000 UTC m=+48.123782810" watchObservedRunningTime="2025-12-05 05:53:48.843804539 +0000 UTC m=+48.123815761" Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.846398 4865 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:53:48 crc kubenswrapper[4865]: I1205 05:53:48.926238 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:48 crc kubenswrapper[4865]: E1205 05:53:48.929051 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:49.429034908 +0000 UTC m=+48.709046120 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:49 crc kubenswrapper[4865]: I1205 05:53:49.027199 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:49 crc kubenswrapper[4865]: E1205 05:53:49.027487 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:49.527435251 +0000 UTC m=+48.807446473 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:49 crc kubenswrapper[4865]: I1205 05:53:49.027565 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:49 crc kubenswrapper[4865]: E1205 05:53:49.027848 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:49.527835862 +0000 UTC m=+48.807847084 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:49 crc kubenswrapper[4865]: I1205 05:53:49.102515 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:53:49 crc kubenswrapper[4865]: I1205 05:53:49.117680 4865 patch_prober.go:28] interesting pod/router-default-5444994796-h6t2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 05:53:49 crc kubenswrapper[4865]: [-]has-synced failed: reason withheld Dec 05 05:53:49 crc kubenswrapper[4865]: [+]process-running ok Dec 05 05:53:49 crc kubenswrapper[4865]: healthz check failed Dec 05 05:53:49 crc kubenswrapper[4865]: I1205 05:53:49.117746 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h6t2z" podUID="6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:53:49 crc kubenswrapper[4865]: I1205 05:53:49.130730 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:49 crc kubenswrapper[4865]: E1205 05:53:49.131735 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:49.63171497 +0000 UTC m=+48.911726202 (durationBeforeRetry 500ms). 
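
The router's startup probe failures above report an HTTP 500 from its health endpoint, with the failing sub-checks ([-]backend-http, [-]has-synced) echoed as the start of the response body. As a rough illustration only (not the kubelet's prober code, and the URL below is hypothetical), an HTTP probe of this kind boils down to a GET with a short timeout that treats any status outside 200-399 as a failure:

// Minimal sketch of an HTTP health probe, assuming a plain-HTTP healthz
// endpoint; the URL in main is hypothetical. This only mirrors the pass/fail
// rule visible in the log entries, not the kubelet's real prober.
package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

func probe(url string) error {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	// Keep only the start of the body, the way the log records "start-of-body".
	body, _ := io.ReadAll(io.LimitReader(resp.Body, 10*1024))
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return nil
	}
	return fmt.Errorf("HTTP probe failed with statuscode: %d, start-of-body=%s", resp.StatusCode, body)
}

func main() {
	if err := probe("http://127.0.0.1:1936/healthz"); err != nil { // hypothetical router healthz address
		fmt.Println("Probe failed:", err)
	} else {
		fmt.Println("Probe ok")
	}
}

Once the router finishes syncing its backends the same endpoint returns 200 and the startup probe clears, which is presumably what ends this run of failures later in the log.
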
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:49 crc kubenswrapper[4865]: I1205 05:53:49.133702 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bg82r" Dec 05 05:53:49 crc kubenswrapper[4865]: I1205 05:53:49.233420 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:49 crc kubenswrapper[4865]: E1205 05:53:49.234168 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:49.734155728 +0000 UTC m=+49.014166950 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:49 crc kubenswrapper[4865]: I1205 05:53:49.334595 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:49 crc kubenswrapper[4865]: E1205 05:53:49.334750 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:49.834725272 +0000 UTC m=+49.114736494 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:49 crc kubenswrapper[4865]: I1205 05:53:49.334990 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:49 crc kubenswrapper[4865]: E1205 05:53:49.335326 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:49.835318959 +0000 UTC m=+49.115330181 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:49 crc kubenswrapper[4865]: I1205 05:53:49.438143 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-95g8c" Dec 05 05:53:49 crc kubenswrapper[4865]: I1205 05:53:49.439294 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:49 crc kubenswrapper[4865]: E1205 05:53:49.439728 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:49.939716312 +0000 UTC m=+49.219727534 (durationBeforeRetry 500ms). 
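
Every mount and unmount failure in this stretch reduces to the same condition: the kubelet has not yet seen kubevirt.io.hostpath-provisioner register over its plugin socket, so it cannot construct a CSI client for the volume. One way to confirm what a node has actually registered is to read its CSINode object; the sketch below is a minimal client-go example assuming an admin kubeconfig in $KUBECONFIG and the node name "crc" taken from these entries.

// Diagnostic sketch (assumptions: KUBECONFIG points at an admin kubeconfig,
// node name is "crc"): list the CSI drivers the kubelet has registered on the
// node, to see whether kubevirt.io.hostpath-provisioner is among them.
package main

import (
	"context"
	"fmt"
	"os"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// The CSINode object mirrors the kubelet's per-node plugin registration state.
	csiNode, err := client.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, d := range csiNode.Spec.Drivers {
		fmt.Printf("registered driver: %s (nodeID %s)\n", d.Name, d.NodeID)
	}
}

If the driver is missing from that list, the errors above are expected and should stop on their own once the hostpath-provisioner DaemonSet pod (started earlier in this log) finishes registering.
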
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:49 crc kubenswrapper[4865]: I1205 05:53:49.544653 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:49 crc kubenswrapper[4865]: E1205 05:53:49.544989 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:50.044977019 +0000 UTC m=+49.324988241 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:49 crc kubenswrapper[4865]: I1205 05:53:49.665787 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:49 crc kubenswrapper[4865]: E1205 05:53:49.666184 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:50.166167789 +0000 UTC m=+49.446179011 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:49 crc kubenswrapper[4865]: I1205 05:53:49.766906 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:49 crc kubenswrapper[4865]: E1205 05:53:49.767565 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:50.267554316 +0000 UTC m=+49.547565538 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:49 crc kubenswrapper[4865]: I1205 05:53:49.872316 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:49 crc kubenswrapper[4865]: E1205 05:53:49.872752 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:50.372737391 +0000 UTC m=+49.652748613 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:49 crc kubenswrapper[4865]: I1205 05:53:49.913004 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vqhhk" event={"ID":"d871e232-b94a-4c90-b2c1-775f93eaa51e","Type":"ContainerStarted","Data":"f9bef6bc621f642425322561844766dcb7f9ca8485fa111dc9f654b0f6a1918c"} Dec 05 05:53:49 crc kubenswrapper[4865]: I1205 05:53:49.951772 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts" event={"ID":"26c8d7ff-d655-4d40-b99c-e11daec5b263","Type":"ContainerStarted","Data":"5d7f81be1f1cfae94bbb439814ce0d19f9890e4c12098ce5df8cbef60486d7fa"} Dec 05 05:53:49 crc kubenswrapper[4865]: I1205 05:53:49.973460 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:49 crc kubenswrapper[4865]: E1205 05:53:49.975460 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:50.47381474 +0000 UTC m=+49.753825962 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.056959 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" event={"ID":"d02c7cde-9b06-4366-9ad0-b46d403446b1","Type":"ContainerStarted","Data":"075acd76f019cda3bf3004165b1d5c4e689e79ffa7d80d01d2c47d5ebdae6280"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.068361 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" event={"ID":"dc34f331-e440-4ee3-98db-070d3115e2fd","Type":"ContainerStarted","Data":"7ba829c282197a2d1bfe501a6445a1ed524048386d2e79fa15847e993e72d9ab"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.080709 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lclsv" event={"ID":"dff0db39-9f6f-4455-8cab-8d4cdce33b04","Type":"ContainerStarted","Data":"68ebc996950a79cbb4e079dc5453a2528493c989198587a834d06c2226ca36c0"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.081603 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:50 crc kubenswrapper[4865]: E1205 05:53:50.082130 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:50.582101423 +0000 UTC m=+49.862112645 (durationBeforeRetry 500ms). 
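
Note the cadence of the failures above: each one is parked with a "No retries permitted until ..." deadline 500ms in the future, and the volume reconciler simply re-attempts the operation on a later pass once that deadline has elapsed. The following is a stdlib-only sketch of that gating pattern, illustrative only and not the kubelet's nestedpendingoperations implementation:

// Illustrative retry gate: a failed operation records a "not before" deadline
// and refuses to run again until the backoff window (500ms here) has passed.
package main

import (
	"errors"
	"fmt"
	"time"
)

type pendingOp struct {
	notBefore time.Time
	delay     time.Duration
}

func (p *pendingOp) tryRun(op func() error) error {
	if time.Now().Before(p.notBefore) {
		return fmt.Errorf("no retries permitted until %s", p.notBefore.Format(time.RFC3339Nano))
	}
	if err := op(); err != nil {
		p.notBefore = time.Now().Add(p.delay) // schedule the earliest next attempt
		return err
	}
	return nil
}

func main() {
	mount := func() error {
		return errors.New("driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers")
	}
	op := &pendingOp{delay: 500 * time.Millisecond}
	for i := 0; i < 3; i++ {
		if err := op.tryRun(mount); err != nil {
			fmt.Println("attempt", i, "error:", err)
		}
		time.Sleep(200 * time.Millisecond)
	}
}
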
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.120310 4865 patch_prober.go:28] interesting pod/router-default-5444994796-h6t2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 05:53:50 crc kubenswrapper[4865]: [-]has-synced failed: reason withheld Dec 05 05:53:50 crc kubenswrapper[4865]: [+]process-running ok Dec 05 05:53:50 crc kubenswrapper[4865]: healthz check failed Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.120365 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h6t2z" podUID="6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.120893 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6s5ts" podStartSLOduration=25.120881554 podStartE2EDuration="25.120881554s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:50.021999117 +0000 UTC m=+49.302010329" watchObservedRunningTime="2025-12-05 05:53:50.120881554 +0000 UTC m=+49.400892776" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.122562 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" podStartSLOduration=25.122554751 podStartE2EDuration="25.122554751s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:50.122060577 +0000 UTC m=+49.402071789" watchObservedRunningTime="2025-12-05 05:53:50.122554751 +0000 UTC m=+49.402565973" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.125102 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j8p6s" event={"ID":"397265e2-4c59-42f5-8774-357048ba57ac","Type":"ContainerStarted","Data":"da715b4c67d6c97f512bcd9c22cb8445b25918a342f6e3b637ca1e9f2d36dcfe"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.141803 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt" event={"ID":"7e31d52b-ab95-489d-bbff-34e4b0daf602","Type":"ContainerStarted","Data":"e41018a311e20f2557c8cbb279eeab76a757dda25ff3660edebe576d66cc5027"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.182954 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:50 crc kubenswrapper[4865]: E1205 05:53:50.185733 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:50.685720244 +0000 UTC m=+49.965731466 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.242447 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-lclsv" podStartSLOduration=25.242430263 podStartE2EDuration="25.242430263s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:50.189968654 +0000 UTC m=+49.469979876" watchObservedRunningTime="2025-12-05 05:53:50.242430263 +0000 UTC m=+49.522441485" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.243941 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-cjczz" podStartSLOduration=25.243931556 podStartE2EDuration="25.243931556s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:50.241855487 +0000 UTC m=+49.521866709" watchObservedRunningTime="2025-12-05 05:53:50.243931556 +0000 UTC m=+49.523942778" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.293468 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:50 crc kubenswrapper[4865]: E1205 05:53:50.293743 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:50.793722169 +0000 UTC m=+50.073733391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.293956 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:50 crc kubenswrapper[4865]: E1205 05:53:50.294235 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:50.794228203 +0000 UTC m=+50.074239425 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.297098 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hgrgd" event={"ID":"45637469-1150-4175-a642-a33eeb1c7a9d","Type":"ContainerStarted","Data":"36e867086659e46a5785071c4f90414a3940f5f5c29ae6831a475683929c0ae2"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.311412 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" event={"ID":"4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00","Type":"ContainerStarted","Data":"ca3843956e6f2f48cfb0495c956911b8c5fd0e7853c2411b78c8dadb008ee491"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.369224 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-drrtb" event={"ID":"d4c0391a-598b-4504-9601-c1b362c3060c","Type":"ContainerStarted","Data":"ef95e802da6359310ceb803e2c7226dd47ab0b3c51c39c1000aee88532f4d9d2"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.398237 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:50 crc kubenswrapper[4865]: E1205 05:53:50.401366 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:50.901330153 +0000 UTC m=+50.181341385 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.401555 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:50 crc kubenswrapper[4865]: E1205 05:53:50.403223 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:50.903211837 +0000 UTC m=+50.183223059 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.449140 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whlbb" event={"ID":"ab8b1868-5f2d-4c43-a63c-525760727b75","Type":"ContainerStarted","Data":"a526496e1de8b1799fc5c0826451730e6acd89ff311034f02690e66b6ede9484"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.486598 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ps4tl" event={"ID":"0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5","Type":"ContainerStarted","Data":"0d024e6dcb1aeb93f23321275c41d245099b2488f86c241902523e1482ef3dfb"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.498804 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d6z25" event={"ID":"55de6799-e76b-4493-a007-49cd203e7573","Type":"ContainerStarted","Data":"f924f65b9b4c391bec705c830e4705e0acaaf34ffebabbe1edc635cc16faacbe"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.503669 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:50 crc kubenswrapper[4865]: E1205 05:53:50.504147 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:51.004131671 +0000 UTC m=+50.284142893 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.508063 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zjllp" event={"ID":"910872c7-f875-4697-9a0a-3ae986b9fca7","Type":"ContainerStarted","Data":"0673903063d7a8273d033f79a4f294b29c6bf80e53cc171622816c015719f341"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.508110 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zjllp" event={"ID":"910872c7-f875-4697-9a0a-3ae986b9fca7","Type":"ContainerStarted","Data":"e9677a95a80e46204ba43c206cdf6972210d952bfdbfb258dd5c3a97fa3425c3"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.537021 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr" event={"ID":"3f657509-34dd-4ea9-84fb-ce4548fde2f3","Type":"ContainerStarted","Data":"a54ff1787920a153450895c977a64f4a7c3641ae6d9220bfe8fc23a4030159d7"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.553682 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb" event={"ID":"d60e1629-83b6-4492-bd6c-c0ed90da02be","Type":"ContainerStarted","Data":"600c03f669aea52a78fb1980f4670028153595dc18e964c5d30397e0e471a32e"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.579955 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w28tr" podStartSLOduration=25.579940791 podStartE2EDuration="25.579940791s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:50.577869612 +0000 UTC m=+49.857880834" watchObservedRunningTime="2025-12-05 05:53:50.579940791 +0000 UTC m=+49.859952013" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.596226 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7lcjb" event={"ID":"cc67cfa4-a4e5-48c4-9f1e-fa56ac7ab17c","Type":"ContainerStarted","Data":"48d846117a1d0d7bcf9f961d70b3bad60262ed9d05e69026dd722cd87c180d65"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.619343 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:50 crc kubenswrapper[4865]: E1205 05:53:50.620686 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:51.120667737 +0000 UTC m=+50.400679019 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.625948 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf" event={"ID":"9813d882-ad86-4946-96ae-caa85c66aaab","Type":"ContainerStarted","Data":"50e86296e1fa530e8c544310370e94ea4d8e233a481452c440e2961d2d347b1c"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.627081 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.630567 4865 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-h4jvf container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.630602 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf" podUID="9813d882-ad86-4946-96ae-caa85c66aaab" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.660699 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb" podStartSLOduration=25.660677842 podStartE2EDuration="25.660677842s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:50.621437289 +0000 UTC m=+49.901448541" watchObservedRunningTime="2025-12-05 05:53:50.660677842 +0000 UTC m=+49.940689064" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.670648 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" event={"ID":"320fa4b0-5e00-4bca-b8f2-1afa7387d156","Type":"ContainerStarted","Data":"3604d14071bdd8605bdf5bc4c609b8d107ec52020bdf5af450a4ca724ee1d5db"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.672077 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" event={"ID":"1f49a368-065d-4057-a044-a019eba9ce9e","Type":"ContainerStarted","Data":"1f3ab2feb01855bef43c2cbb2b494a7ec28e6428d9d18f10664106b67e7eb194"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.687212 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" event={"ID":"ed63d1de-99a4-4b79-9aee-49a5484d968f","Type":"ContainerStarted","Data":"8712d55fccf81075cfd56ea0d36cfad5778856a77fc81d29b4e20de617a1372e"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.688098 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.694080 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf" podStartSLOduration=25.69404942 podStartE2EDuration="25.69404942s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:50.693894255 +0000 UTC m=+49.973905477" watchObservedRunningTime="2025-12-05 05:53:50.69404942 +0000 UTC m=+49.974060632" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.695061 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7lcjb" podStartSLOduration=10.695053228 podStartE2EDuration="10.695053228s" podCreationTimestamp="2025-12-05 05:53:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:50.66340016 +0000 UTC m=+49.943411382" watchObservedRunningTime="2025-12-05 05:53:50.695053228 +0000 UTC m=+49.975064450" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.706048 4865 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7jm2h container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" start-of-body= Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.706117 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" podUID="ed63d1de-99a4-4b79-9aee-49a5484d968f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.724272 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:50 crc kubenswrapper[4865]: E1205 05:53:50.724412 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:51.224394391 +0000 UTC m=+50.504405613 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.724500 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:50 crc kubenswrapper[4865]: E1205 05:53:50.725046 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:51.225037429 +0000 UTC m=+50.505048651 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.726402 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" event={"ID":"c3633127-3192-43e8-87a2-5049b2d82fa6","Type":"ContainerStarted","Data":"f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.726980 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.760009 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fmdqp" event={"ID":"5293d191-528f-4818-b897-11bb456c2b50","Type":"ContainerStarted","Data":"54b95e94e37a853b468bc575828f4ed968a6f7f0d867163a6c5be63a4a0b5b54"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.829368 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:50 crc kubenswrapper[4865]: E1205 05:53:50.830524 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:51.330506592 +0000 UTC m=+50.610517824 (durationBeforeRetry 500ms). 
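
The olm-operator, packageserver, and oauth-openshift readiness failures above are all plain "connection refused" errors, which usually just mean the freshly started container has not bound its port yet, rather than anything being unreachable. Below is a minimal TCP-level check that separates that case from a timeout; the address is taken from the log and the check itself is illustrative, not how the kubelet probes.

// Illustrative TCP reachability check: "connection refused" means the address
// answered but nothing is listening yet, so retrying (as the kubelet does
// above) is the right reaction; a timeout would point at a network problem.
package main

import (
	"errors"
	"fmt"
	"net"
	"syscall"
	"time"
)

func main() {
	addr := "10.217.0.30:5443" // packageserver endpoint from the log entries above
	conn, err := net.DialTimeout("tcp", addr, 1*time.Second)
	var nerr net.Error
	switch {
	case err == nil:
		conn.Close()
		fmt.Println("port open, the HTTPS healthz probe can proceed")
	case errors.Is(err, syscall.ECONNREFUSED):
		fmt.Println("connection refused: process not listening yet, retry later")
	case errors.As(err, &nerr) && nerr.Timeout():
		fmt.Println("timed out: likely a network or routing problem")
	default:
		fmt.Println("probe error:", err)
	}
}
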
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.833096 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" event={"ID":"87a82cae-057c-47d5-9703-eb48128e1bd9","Type":"ContainerStarted","Data":"e83abea23c54966ab2a465c991a97efda4b707d5d861cbf48efeff3081a32f03"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.833912 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.844948 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" podStartSLOduration=25.844924942 podStartE2EDuration="25.844924942s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:50.833032464 +0000 UTC m=+50.113043696" watchObservedRunningTime="2025-12-05 05:53:50.844924942 +0000 UTC m=+50.124936164" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.846065 4865 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-82xnx container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body= Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.846120 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" podUID="87a82cae-057c-47d5-9703-eb48128e1bd9" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.886284 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.897447 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-v2jhh" event={"ID":"c0cebc10-c0ad-419c-903c-341c516f1527","Type":"ContainerStarted","Data":"626491cb0694102c957b4ce185281cf41e593021e58d46a69dd50c3afb3936c5"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.914105 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" podStartSLOduration=10.914088914 podStartE2EDuration="10.914088914s" podCreationTimestamp="2025-12-05 05:53:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:50.888678473 +0000 UTC m=+50.168689695" watchObservedRunningTime="2025-12-05 05:53:50.914088914 +0000 UTC m=+50.194100136" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.914746 4865 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-fsjkk"] Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.915759 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fsjkk" Dec 05 05:53:50 crc kubenswrapper[4865]: W1205 05:53:50.926143 4865 reflector.go:561] object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g": failed to list *v1.Secret: secrets "certified-operators-dockercfg-4rs5g" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object Dec 05 05:53:50 crc kubenswrapper[4865]: E1205 05:53:50.926188 4865 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-4rs5g\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"certified-operators-dockercfg-4rs5g\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.939242 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v5wv7" event={"ID":"cc594832-ffe0-4764-95ac-f62739f0314e","Type":"ContainerStarted","Data":"5be68b69a48dc568df25109d47f11004aabd1285691ce467763e2324c2eccd84"} Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.939539 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:50 crc kubenswrapper[4865]: E1205 05:53:50.940573 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:51.440560996 +0000 UTC m=+50.720572218 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.941016 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fmdqp" podStartSLOduration=25.941000268 podStartE2EDuration="25.941000268s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:50.938975041 +0000 UTC m=+50.218986263" watchObservedRunningTime="2025-12-05 05:53:50.941000268 +0000 UTC m=+50.221011490" Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.943094 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fsjkk"] Dec 05 05:53:50 crc kubenswrapper[4865]: I1205 05:53:50.968475 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kxq8c" event={"ID":"51c9c322-fb54-438d-a353-d8deaa07b4fb","Type":"ContainerStarted","Data":"3b6661ef047e26b063d1a23b1595a1a465454ddcfe31a226677b9694069003a9"} Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.003742 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8dtw" event={"ID":"d2c86fc9-df64-415a-bfbe-8a6049dc4d55","Type":"ContainerStarted","Data":"3239032ce3f9484d6fb2c1b4c4d538da74ee5816e7c91370453324f3ce03e3b8"} Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.040335 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.040722 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc0b366c-dba6-4a98-8335-e5434858e367-catalog-content\") pod \"certified-operators-fsjkk\" (UID: \"fc0b366c-dba6-4a98-8335-e5434858e367\") " pod="openshift-marketplace/certified-operators-fsjkk" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.040772 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc0b366c-dba6-4a98-8335-e5434858e367-utilities\") pod \"certified-operators-fsjkk\" (UID: \"fc0b366c-dba6-4a98-8335-e5434858e367\") " pod="openshift-marketplace/certified-operators-fsjkk" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.040834 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t8j9\" (UniqueName: \"kubernetes.io/projected/fc0b366c-dba6-4a98-8335-e5434858e367-kube-api-access-9t8j9\") pod \"certified-operators-fsjkk\" (UID: 
\"fc0b366c-dba6-4a98-8335-e5434858e367\") " pod="openshift-marketplace/certified-operators-fsjkk" Dec 05 05:53:51 crc kubenswrapper[4865]: E1205 05:53:51.040935 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:51.540919414 +0000 UTC m=+50.820930636 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.107675 4865 patch_prober.go:28] interesting pod/router-default-5444994796-h6t2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 05:53:51 crc kubenswrapper[4865]: [-]has-synced failed: reason withheld Dec 05 05:53:51 crc kubenswrapper[4865]: [+]process-running ok Dec 05 05:53:51 crc kubenswrapper[4865]: healthz check failed Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.108191 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h6t2z" podUID="6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.141840 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc0b366c-dba6-4a98-8335-e5434858e367-catalog-content\") pod \"certified-operators-fsjkk\" (UID: \"fc0b366c-dba6-4a98-8335-e5434858e367\") " pod="openshift-marketplace/certified-operators-fsjkk" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.141900 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc0b366c-dba6-4a98-8335-e5434858e367-utilities\") pod \"certified-operators-fsjkk\" (UID: \"fc0b366c-dba6-4a98-8335-e5434858e367\") " pod="openshift-marketplace/certified-operators-fsjkk" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.141986 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t8j9\" (UniqueName: \"kubernetes.io/projected/fc0b366c-dba6-4a98-8335-e5434858e367-kube-api-access-9t8j9\") pod \"certified-operators-fsjkk\" (UID: \"fc0b366c-dba6-4a98-8335-e5434858e367\") " pod="openshift-marketplace/certified-operators-fsjkk" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.142028 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.144918 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc0b366c-dba6-4a98-8335-e5434858e367-catalog-content\") pod \"certified-operators-fsjkk\" (UID: \"fc0b366c-dba6-4a98-8335-e5434858e367\") " pod="openshift-marketplace/certified-operators-fsjkk" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.145633 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc0b366c-dba6-4a98-8335-e5434858e367-utilities\") pod \"certified-operators-fsjkk\" (UID: \"fc0b366c-dba6-4a98-8335-e5434858e367\") " pod="openshift-marketplace/certified-operators-fsjkk" Dec 05 05:53:51 crc kubenswrapper[4865]: E1205 05:53:51.145676 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:51.645662387 +0000 UTC m=+50.925673689 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.263885 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:51 crc kubenswrapper[4865]: E1205 05:53:51.341004 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:51.84096962 +0000 UTC m=+51.120980842 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.343568 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bdk79"] Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.364737 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t8j9\" (UniqueName: \"kubernetes.io/projected/fc0b366c-dba6-4a98-8335-e5434858e367-kube-api-access-9t8j9\") pod \"certified-operators-fsjkk\" (UID: \"fc0b366c-dba6-4a98-8335-e5434858e367\") " pod="openshift-marketplace/certified-operators-fsjkk" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.376155 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bdk79" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.422595 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.476888 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bdk79"] Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.476957 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5zhc2"] Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.482496 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" podStartSLOduration=26.482474296 podStartE2EDuration="26.482474296s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:51.473724267 +0000 UTC m=+50.753735499" watchObservedRunningTime="2025-12-05 05:53:51.482474296 +0000 UTC m=+50.762485518" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.485482 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5zhc2" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.488929 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5zhc2"] Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.490095 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ed66f14-1ac8-456b-b3bb-d909c0164767-utilities\") pod \"community-operators-bdk79\" (UID: \"8ed66f14-1ac8-456b-b3bb-d909c0164767\") " pod="openshift-marketplace/community-operators-bdk79" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.490243 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.490313 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ed66f14-1ac8-456b-b3bb-d909c0164767-catalog-content\") pod \"community-operators-bdk79\" (UID: \"8ed66f14-1ac8-456b-b3bb-d909c0164767\") " pod="openshift-marketplace/community-operators-bdk79" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.490373 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjbll\" (UniqueName: \"kubernetes.io/projected/8ed66f14-1ac8-456b-b3bb-d909c0164767-kube-api-access-bjbll\") pod \"community-operators-bdk79\" (UID: \"8ed66f14-1ac8-456b-b3bb-d909c0164767\") " pod="openshift-marketplace/community-operators-bdk79" Dec 05 05:53:51 crc kubenswrapper[4865]: E1205 05:53:51.492999 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-05 05:53:51.992985494 +0000 UTC m=+51.272996716 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.568386 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ccqrr"] Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.584390 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ccqrr" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.592191 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.592552 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ed66f14-1ac8-456b-b3bb-d909c0164767-utilities\") pod \"community-operators-bdk79\" (UID: \"8ed66f14-1ac8-456b-b3bb-d909c0164767\") " pod="openshift-marketplace/community-operators-bdk79" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.592628 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ed66f14-1ac8-456b-b3bb-d909c0164767-catalog-content\") pod \"community-operators-bdk79\" (UID: \"8ed66f14-1ac8-456b-b3bb-d909c0164767\") " pod="openshift-marketplace/community-operators-bdk79" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.592675 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjbll\" (UniqueName: \"kubernetes.io/projected/8ed66f14-1ac8-456b-b3bb-d909c0164767-kube-api-access-bjbll\") pod \"community-operators-bdk79\" (UID: \"8ed66f14-1ac8-456b-b3bb-d909c0164767\") " pod="openshift-marketplace/community-operators-bdk79" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.593314 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ed66f14-1ac8-456b-b3bb-d909c0164767-utilities\") pod \"community-operators-bdk79\" (UID: \"8ed66f14-1ac8-456b-b3bb-d909c0164767\") " pod="openshift-marketplace/community-operators-bdk79" Dec 05 05:53:51 crc kubenswrapper[4865]: E1205 05:53:51.593420 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:52.093399894 +0000 UTC m=+51.373411166 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.598159 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ed66f14-1ac8-456b-b3bb-d909c0164767-catalog-content\") pod \"community-operators-bdk79\" (UID: \"8ed66f14-1ac8-456b-b3bb-d909c0164767\") " pod="openshift-marketplace/community-operators-bdk79" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.626727 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ccqrr"] Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.671010 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjbll\" (UniqueName: \"kubernetes.io/projected/8ed66f14-1ac8-456b-b3bb-d909c0164767-kube-api-access-bjbll\") pod \"community-operators-bdk79\" (UID: \"8ed66f14-1ac8-456b-b3bb-d909c0164767\") " pod="openshift-marketplace/community-operators-bdk79" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.694261 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psql8\" (UniqueName: \"kubernetes.io/projected/582e42c0-b2d0-4b24-900e-1316a155c471-kube-api-access-psql8\") pod \"community-operators-ccqrr\" (UID: \"582e42c0-b2d0-4b24-900e-1316a155c471\") " pod="openshift-marketplace/community-operators-ccqrr" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.694332 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a13b3fdc-c602-48f5-bc10-e2e30df8cc0e-utilities\") pod \"certified-operators-5zhc2\" (UID: \"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e\") " pod="openshift-marketplace/certified-operators-5zhc2" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.694364 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42qmf\" (UniqueName: \"kubernetes.io/projected/a13b3fdc-c602-48f5-bc10-e2e30df8cc0e-kube-api-access-42qmf\") pod \"certified-operators-5zhc2\" (UID: \"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e\") " pod="openshift-marketplace/certified-operators-5zhc2" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.694395 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/582e42c0-b2d0-4b24-900e-1316a155c471-utilities\") pod \"community-operators-ccqrr\" (UID: \"582e42c0-b2d0-4b24-900e-1316a155c471\") " pod="openshift-marketplace/community-operators-ccqrr" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.694446 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/582e42c0-b2d0-4b24-900e-1316a155c471-catalog-content\") pod \"community-operators-ccqrr\" (UID: \"582e42c0-b2d0-4b24-900e-1316a155c471\") " pod="openshift-marketplace/community-operators-ccqrr" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.694467 
4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a13b3fdc-c602-48f5-bc10-e2e30df8cc0e-catalog-content\") pod \"certified-operators-5zhc2\" (UID: \"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e\") " pod="openshift-marketplace/certified-operators-5zhc2" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.694491 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:51 crc kubenswrapper[4865]: E1205 05:53:51.694807 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:52.194793241 +0000 UTC m=+51.474804463 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.789257 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bdk79" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.797267 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.797540 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42qmf\" (UniqueName: \"kubernetes.io/projected/a13b3fdc-c602-48f5-bc10-e2e30df8cc0e-kube-api-access-42qmf\") pod \"certified-operators-5zhc2\" (UID: \"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e\") " pod="openshift-marketplace/certified-operators-5zhc2" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.797596 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/582e42c0-b2d0-4b24-900e-1316a155c471-utilities\") pod \"community-operators-ccqrr\" (UID: \"582e42c0-b2d0-4b24-900e-1316a155c471\") " pod="openshift-marketplace/community-operators-ccqrr" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.797672 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/582e42c0-b2d0-4b24-900e-1316a155c471-catalog-content\") pod \"community-operators-ccqrr\" (UID: \"582e42c0-b2d0-4b24-900e-1316a155c471\") " pod="openshift-marketplace/community-operators-ccqrr" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.797703 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a13b3fdc-c602-48f5-bc10-e2e30df8cc0e-catalog-content\") pod \"certified-operators-5zhc2\" (UID: \"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e\") " pod="openshift-marketplace/certified-operators-5zhc2" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.797744 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psql8\" (UniqueName: \"kubernetes.io/projected/582e42c0-b2d0-4b24-900e-1316a155c471-kube-api-access-psql8\") pod \"community-operators-ccqrr\" (UID: \"582e42c0-b2d0-4b24-900e-1316a155c471\") " pod="openshift-marketplace/community-operators-ccqrr" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.797784 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a13b3fdc-c602-48f5-bc10-e2e30df8cc0e-utilities\") pod \"certified-operators-5zhc2\" (UID: \"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e\") " pod="openshift-marketplace/certified-operators-5zhc2" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.798335 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/582e42c0-b2d0-4b24-900e-1316a155c471-utilities\") pod \"community-operators-ccqrr\" (UID: \"582e42c0-b2d0-4b24-900e-1316a155c471\") " pod="openshift-marketplace/community-operators-ccqrr" Dec 05 05:53:51 crc kubenswrapper[4865]: E1205 05:53:51.798428 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:52.298411722 +0000 UTC m=+51.578422944 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.802747 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/582e42c0-b2d0-4b24-900e-1316a155c471-catalog-content\") pod \"community-operators-ccqrr\" (UID: \"582e42c0-b2d0-4b24-900e-1316a155c471\") " pod="openshift-marketplace/community-operators-ccqrr" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.809388 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a13b3fdc-c602-48f5-bc10-e2e30df8cc0e-utilities\") pod \"certified-operators-5zhc2\" (UID: \"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e\") " pod="openshift-marketplace/certified-operators-5zhc2" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.820125 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a13b3fdc-c602-48f5-bc10-e2e30df8cc0e-catalog-content\") pod \"certified-operators-5zhc2\" (UID: \"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e\") " pod="openshift-marketplace/certified-operators-5zhc2" Dec 05 05:53:51 crc kubenswrapper[4865]: I1205 05:53:51.898786 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:51 crc kubenswrapper[4865]: E1205 05:53:51.899335 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:52.399310856 +0000 UTC m=+51.679322078 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.000583 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:52 crc kubenswrapper[4865]: E1205 05:53:52.001055 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 05:53:52.501040643 +0000 UTC m=+51.781051875 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.034486 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zjllp" event={"ID":"910872c7-f875-4697-9a0a-3ae986b9fca7","Type":"ContainerStarted","Data":"293d3510e55296778024d308a6fc4ea12d2aa7f54c0c25deff25c623828135f6"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.034779 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42qmf\" (UniqueName: \"kubernetes.io/projected/a13b3fdc-c602-48f5-bc10-e2e30df8cc0e-kube-api-access-42qmf\") pod \"certified-operators-5zhc2\" (UID: \"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e\") " pod="openshift-marketplace/certified-operators-5zhc2" Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.061251 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psql8\" (UniqueName: \"kubernetes.io/projected/582e42c0-b2d0-4b24-900e-1316a155c471-kube-api-access-psql8\") pod \"community-operators-ccqrr\" (UID: \"582e42c0-b2d0-4b24-900e-1316a155c471\") " pod="openshift-marketplace/community-operators-ccqrr" Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.067291 4865 generic.go:334] "Generic (PLEG): container finished" podID="4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00" containerID="ca3843956e6f2f48cfb0495c956911b8c5fd0e7853c2411b78c8dadb008ee491" exitCode=0 Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.067387 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" event={"ID":"4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00","Type":"ContainerDied","Data":"ca3843956e6f2f48cfb0495c956911b8c5fd0e7853c2411b78c8dadb008ee491"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.067435 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" event={"ID":"4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00","Type":"ContainerStarted","Data":"67070a4b47584733c64daf304482259d8bd9912f300f843cf0421fa676ff6d8f"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.079153 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kxq8c" event={"ID":"51c9c322-fb54-438d-a353-d8deaa07b4fb","Type":"ContainerStarted","Data":"c0bed64d47096315255cb506f5de0eca1873ca2ab13435786d1b1a29e32dcfcd"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.105980 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:52 crc kubenswrapper[4865]: E1205 05:53:52.106374 4865 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:52.606357712 +0000 UTC m=+51.886368934 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.113670 4865 patch_prober.go:28] interesting pod/router-default-5444994796-h6t2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 05:53:52 crc kubenswrapper[4865]: [-]has-synced failed: reason withheld Dec 05 05:53:52 crc kubenswrapper[4865]: [+]process-running ok Dec 05 05:53:52 crc kubenswrapper[4865]: healthz check failed Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.113732 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h6t2z" podUID="6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.139717 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" event={"ID":"320fa4b0-5e00-4bca-b8f2-1afa7387d156","Type":"ContainerStarted","Data":"7845c2366faacd9368e2c51bed16cc1973390069d22a6e385e340e3976254ada"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.141811 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8dtw" event={"ID":"d2c86fc9-df64-415a-bfbe-8a6049dc4d55","Type":"ContainerStarted","Data":"93c6f64bd164a6af311c7fa26dba0b09b315bd4ea5c514ac1998a863323d6cd3"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.144378 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ps4tl" event={"ID":"0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5","Type":"ContainerStarted","Data":"b9d214cc8ffc8d76695002678ce20f25001346f3e10c2ed1b9e3a3ca99fece8a"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.146046 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d6z25" event={"ID":"55de6799-e76b-4493-a007-49cd203e7573","Type":"ContainerStarted","Data":"5562eeeba0cbd4ebedaf9f7244d1a73fd8aa40c2ef7e53078aa86fcbe426df83"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.146734 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-d6z25" Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.151401 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.151462 4865 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.152728 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qvs5j" event={"ID":"277ec781-2698-4637-a619-e9bdc3fcabae","Type":"ContainerStarted","Data":"5176e5b9990c6543602f5e768578049c82d98fbaa3cc86eb74fd23cf7c8f6044"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.172235 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j8p6s" event={"ID":"397265e2-4c59-42f5-8774-357048ba57ac","Type":"ContainerStarted","Data":"a8bec5876374887a3db27d69d864989af69fe37eb65815247d195865cafc2152"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.207979 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:52 crc kubenswrapper[4865]: E1205 05:53:52.210479 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:52.710455386 +0000 UTC m=+51.990466608 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.225056 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-v2jhh" event={"ID":"c0cebc10-c0ad-419c-903c-341c516f1527","Type":"ContainerStarted","Data":"78ae2f1cd8a05f784ef9862c62f8db4fb5be08f436438ff84717f190f653df1f"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.231217 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ccqrr" Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.280155 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vqhhk" event={"ID":"d871e232-b94a-4c90-b2c1-775f93eaa51e","Type":"ContainerStarted","Data":"7c9fe150ee770055d2125e9c8efb1dc35870e903b41a57e13eb6f64f216efc35"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.319614 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:52 crc kubenswrapper[4865]: E1205 05:53:52.320871 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:52.820858489 +0000 UTC m=+52.100869701 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.326206 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-drrtb" event={"ID":"d4c0391a-598b-4504-9601-c1b362c3060c","Type":"ContainerStarted","Data":"4877ef8fcd52b05ab6da6de4636cd2e04037343967dea30f428005b80f82861b"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.368535 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" event={"ID":"63297c46-fc31-471d-99dc-47352f52e76c","Type":"ContainerStarted","Data":"7662ac0fd55e36a6de4522dbfeeca2ca8f003a278ca10d5848145500eec2eeb3"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.420716 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:52 crc kubenswrapper[4865]: E1205 05:53:52.422424 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:52.922405911 +0000 UTC m=+52.202417133 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.449224 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whlbb" event={"ID":"ab8b1868-5f2d-4c43-a63c-525760727b75","Type":"ContainerStarted","Data":"52fd600971750440335ab74ed03ad054ee7f9812c0ff801c0b521a405b2fd231"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.449285 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whlbb" event={"ID":"ab8b1868-5f2d-4c43-a63c-525760727b75","Type":"ContainerStarted","Data":"344a18bdb04527f5fd725526d3c88195ced43e4ef8f2e1718abc8fd082163c00"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.450643 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whlbb" Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.478589 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l4sds" event={"ID":"26a6ffad-bce4-4d0f-a9ec-ebc85cb34cd8","Type":"ContainerStarted","Data":"ded85ddae4bcbc91793f9044cf68d28dff708c59ad68e930da6b70a3d3c3584e"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.504860 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.507375 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fmdqp" event={"ID":"5293d191-528f-4818-b897-11bb456c2b50","Type":"ContainerStarted","Data":"bf4de2266f5c10b1e9f8c5c5f14929b7f03a4df91476aa6540e9e31aeba2d2dd"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.512595 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5zhc2" Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.520813 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fsjkk" Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.523620 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:52 crc kubenswrapper[4865]: E1205 05:53:52.524012 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:53.023996834 +0000 UTC m=+52.304008056 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.526756 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hgrgd" event={"ID":"45637469-1150-4175-a642-a33eeb1c7a9d","Type":"ContainerStarted","Data":"c91abebe3592fc56799a04b1ae6f1fe3f3fae9d04189ed0eb75144d61c4fa19b"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.550128 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt" event={"ID":"7e31d52b-ab95-489d-bbff-34e4b0daf602","Type":"ContainerStarted","Data":"6afff47d8bf82f4060272789cd00ec44bc346a83899cdfbdc1e1e4f9a0e1947c"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.586278 4865 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7jm2h container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" start-of-body= Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.586354 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" podUID="ed63d1de-99a4-4b79-9aee-49a5484d968f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.587048 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" event={"ID":"1f49a368-065d-4057-a044-a019eba9ce9e","Type":"ContainerStarted","Data":"420b599485c42d5136bba3e581869141e0c3970fbbaf3e429e65daa33707125d"} Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.587098 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.587700 4865 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8thws container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.587726 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" podUID="1f49a368-065d-4057-a044-a019eba9ce9e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.591807 4865 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-82xnx container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body= Dec 05 05:53:52 
crc kubenswrapper[4865]: I1205 05:53:52.591880 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" podUID="87a82cae-057c-47d5-9703-eb48128e1bd9" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.607217 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h4jvf" Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.628288 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:52 crc kubenswrapper[4865]: E1205 05:53:52.629494 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:53.129479758 +0000 UTC m=+52.409490980 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.700188 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-k9x8b"] Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.734760 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:52 crc kubenswrapper[4865]: E1205 05:53:52.735344 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:53.235324852 +0000 UTC m=+52.515336124 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.845041 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:52 crc kubenswrapper[4865]: E1205 05:53:52.846309 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:53.346290742 +0000 UTC m=+52.626301964 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.946904 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:52 crc kubenswrapper[4865]: E1205 05:53:52.947199 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:53.447188115 +0000 UTC m=+52.727199337 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.951887 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jjg7p"] Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.957766 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jjg7p" Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.974356 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.974393 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:53:52 crc kubenswrapper[4865]: I1205 05:53:52.983510 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.001724 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jjg7p"] Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.047639 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.047992 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbw8f\" (UniqueName: \"kubernetes.io/projected/f43580fe-7567-4fb7-b1fc-203bda11942a-kube-api-access-sbw8f\") pod \"redhat-marketplace-jjg7p\" (UID: \"f43580fe-7567-4fb7-b1fc-203bda11942a\") " pod="openshift-marketplace/redhat-marketplace-jjg7p" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.048073 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f43580fe-7567-4fb7-b1fc-203bda11942a-catalog-content\") pod \"redhat-marketplace-jjg7p\" (UID: \"f43580fe-7567-4fb7-b1fc-203bda11942a\") " pod="openshift-marketplace/redhat-marketplace-jjg7p" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.048119 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f43580fe-7567-4fb7-b1fc-203bda11942a-utilities\") pod \"redhat-marketplace-jjg7p\" (UID: \"f43580fe-7567-4fb7-b1fc-203bda11942a\") " pod="openshift-marketplace/redhat-marketplace-jjg7p" Dec 05 05:53:53 crc kubenswrapper[4865]: E1205 05:53:53.048235 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:53.548217502 +0000 UTC m=+52.828228724 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.109994 4865 patch_prober.go:28] interesting pod/router-default-5444994796-h6t2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 05:53:53 crc kubenswrapper[4865]: [-]has-synced failed: reason withheld Dec 05 05:53:53 crc kubenswrapper[4865]: [+]process-running ok Dec 05 05:53:53 crc kubenswrapper[4865]: healthz check failed Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.110383 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h6t2z" podUID="6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.129437 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.129479 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.212513 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f43580fe-7567-4fb7-b1fc-203bda11942a-catalog-content\") pod \"redhat-marketplace-jjg7p\" (UID: \"f43580fe-7567-4fb7-b1fc-203bda11942a\") " pod="openshift-marketplace/redhat-marketplace-jjg7p" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.212656 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f43580fe-7567-4fb7-b1fc-203bda11942a-utilities\") pod \"redhat-marketplace-jjg7p\" (UID: \"f43580fe-7567-4fb7-b1fc-203bda11942a\") " pod="openshift-marketplace/redhat-marketplace-jjg7p" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.212732 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbw8f\" (UniqueName: \"kubernetes.io/projected/f43580fe-7567-4fb7-b1fc-203bda11942a-kube-api-access-sbw8f\") pod \"redhat-marketplace-jjg7p\" (UID: \"f43580fe-7567-4fb7-b1fc-203bda11942a\") " pod="openshift-marketplace/redhat-marketplace-jjg7p" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.212798 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:53 crc kubenswrapper[4865]: E1205 05:53:53.213184 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-05 05:53:53.713170744 +0000 UTC m=+52.993181966 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.213575 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f43580fe-7567-4fb7-b1fc-203bda11942a-catalog-content\") pod \"redhat-marketplace-jjg7p\" (UID: \"f43580fe-7567-4fb7-b1fc-203bda11942a\") " pod="openshift-marketplace/redhat-marketplace-jjg7p" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.213920 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f43580fe-7567-4fb7-b1fc-203bda11942a-utilities\") pod \"redhat-marketplace-jjg7p\" (UID: \"f43580fe-7567-4fb7-b1fc-203bda11942a\") " pod="openshift-marketplace/redhat-marketplace-jjg7p" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.297346 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbw8f\" (UniqueName: \"kubernetes.io/projected/f43580fe-7567-4fb7-b1fc-203bda11942a-kube-api-access-sbw8f\") pod \"redhat-marketplace-jjg7p\" (UID: \"f43580fe-7567-4fb7-b1fc-203bda11942a\") " pod="openshift-marketplace/redhat-marketplace-jjg7p" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.313789 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:53 crc kubenswrapper[4865]: E1205 05:53:53.313956 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:53.813932264 +0000 UTC m=+53.093943486 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.314108 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:53 crc kubenswrapper[4865]: E1205 05:53:53.314895 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:53.814882991 +0000 UTC m=+53.094894213 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.346536 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v5wv7" podStartSLOduration=28.346517268 podStartE2EDuration="28.346517268s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:53.313154902 +0000 UTC m=+52.593166124" watchObservedRunningTime="2025-12-05 05:53:53.346517268 +0000 UTC m=+52.626528490" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.348130 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mds9l"] Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.349274 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mds9l" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.417114 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:53 crc kubenswrapper[4865]: E1205 05:53:53.417312 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:53.917291787 +0000 UTC m=+53.197302999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.417386 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6294\" (UniqueName: \"kubernetes.io/projected/bbe8803a-815d-4318-bfaa-1949755ed910-kube-api-access-b6294\") pod \"redhat-marketplace-mds9l\" (UID: \"bbe8803a-815d-4318-bfaa-1949755ed910\") " pod="openshift-marketplace/redhat-marketplace-mds9l" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.417440 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbe8803a-815d-4318-bfaa-1949755ed910-utilities\") pod \"redhat-marketplace-mds9l\" (UID: \"bbe8803a-815d-4318-bfaa-1949755ed910\") " pod="openshift-marketplace/redhat-marketplace-mds9l" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.417607 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.417668 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbe8803a-815d-4318-bfaa-1949755ed910-catalog-content\") pod \"redhat-marketplace-mds9l\" (UID: \"bbe8803a-815d-4318-bfaa-1949755ed910\") " pod="openshift-marketplace/redhat-marketplace-mds9l" Dec 05 05:53:53 crc kubenswrapper[4865]: E1205 05:53:53.417978 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:53.917963466 +0000 UTC m=+53.197974758 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.435417 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mds9l"] Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.522124 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.522396 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbe8803a-815d-4318-bfaa-1949755ed910-catalog-content\") pod \"redhat-marketplace-mds9l\" (UID: \"bbe8803a-815d-4318-bfaa-1949755ed910\") " pod="openshift-marketplace/redhat-marketplace-mds9l" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.522471 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6294\" (UniqueName: \"kubernetes.io/projected/bbe8803a-815d-4318-bfaa-1949755ed910-kube-api-access-b6294\") pod \"redhat-marketplace-mds9l\" (UID: \"bbe8803a-815d-4318-bfaa-1949755ed910\") " pod="openshift-marketplace/redhat-marketplace-mds9l" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.522495 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbe8803a-815d-4318-bfaa-1949755ed910-utilities\") pod \"redhat-marketplace-mds9l\" (UID: \"bbe8803a-815d-4318-bfaa-1949755ed910\") " pod="openshift-marketplace/redhat-marketplace-mds9l" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.522956 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbe8803a-815d-4318-bfaa-1949755ed910-utilities\") pod \"redhat-marketplace-mds9l\" (UID: \"bbe8803a-815d-4318-bfaa-1949755ed910\") " pod="openshift-marketplace/redhat-marketplace-mds9l" Dec 05 05:53:53 crc kubenswrapper[4865]: E1205 05:53:53.523015 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:54.022991597 +0000 UTC m=+53.303002819 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.523388 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbe8803a-815d-4318-bfaa-1949755ed910-catalog-content\") pod \"redhat-marketplace-mds9l\" (UID: \"bbe8803a-815d-4318-bfaa-1949755ed910\") " pod="openshift-marketplace/redhat-marketplace-mds9l" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.592355 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jjg7p" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.595406 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kxq8c" podStartSLOduration=28.595393542 podStartE2EDuration="28.595393542s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:53.567331545 +0000 UTC m=+52.847342767" watchObservedRunningTime="2025-12-05 05:53:53.595393542 +0000 UTC m=+52.875404764" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.622004 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6294\" (UniqueName: \"kubernetes.io/projected/bbe8803a-815d-4318-bfaa-1949755ed910-kube-api-access-b6294\") pod \"redhat-marketplace-mds9l\" (UID: \"bbe8803a-815d-4318-bfaa-1949755ed910\") " pod="openshift-marketplace/redhat-marketplace-mds9l" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.623413 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:53 crc kubenswrapper[4865]: E1205 05:53:53.623725 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:54.123714186 +0000 UTC m=+53.403725408 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.624109 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vqhhk" event={"ID":"d871e232-b94a-4c90-b2c1-775f93eaa51e","Type":"ContainerStarted","Data":"a45ac249c109f838df1c926779cc84668d3ee216fc3af93f4fc4496ee57e4e9f"} Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.630118 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ps4tl" event={"ID":"0ee62aa6-06b7-435e-802f-fc4e6a9ca0a5","Type":"ContainerStarted","Data":"dee3f192e3fe5db93037ca2979ba1e3d6422b27e72950d6c3e274469bc12e64c"} Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.630220 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-ps4tl" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.632278 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.635999 4865 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8thws container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.636047 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" podUID="1f49a368-065d-4057-a044-a019eba9ce9e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.636359 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.636378 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.684556 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mds9l" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.727728 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:53 crc kubenswrapper[4865]: E1205 05:53:53.730097 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:54.230075714 +0000 UTC m=+53.510086936 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.784298 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ccqrr"] Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.829908 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:53 crc kubenswrapper[4865]: E1205 05:53:53.830250 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:54.330238457 +0000 UTC m=+53.610249679 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.929048 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:53 crc kubenswrapper[4865]: I1205 05:53:53.931331 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:53 crc kubenswrapper[4865]: E1205 05:53:53.931644 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:54.431628375 +0000 UTC m=+53.711639597 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.036947 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:54 crc kubenswrapper[4865]: E1205 05:53:54.037470 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:54.537455208 +0000 UTC m=+53.817466430 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.038665 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bdk79"] Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.078851 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-26v4f" podStartSLOduration=14.078827422 podStartE2EDuration="14.078827422s" podCreationTimestamp="2025-12-05 05:53:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:54.077134614 +0000 UTC m=+53.357145836" watchObservedRunningTime="2025-12-05 05:53:54.078827422 +0000 UTC m=+53.358838644" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.107061 4865 patch_prober.go:28] interesting pod/router-default-5444994796-h6t2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 05:53:54 crc kubenswrapper[4865]: [-]has-synced failed: reason withheld Dec 05 05:53:54 crc kubenswrapper[4865]: [+]process-running ok Dec 05 05:53:54 crc kubenswrapper[4865]: healthz check failed Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.107125 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h6t2z" podUID="6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.138357 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:54 crc kubenswrapper[4865]: E1205 05:53:54.138685 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:54.63866974 +0000 UTC m=+53.918680962 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.233856 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-j8p6s" podStartSLOduration=29.23382481 podStartE2EDuration="29.23382481s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:54.231650078 +0000 UTC m=+53.511661300" watchObservedRunningTime="2025-12-05 05:53:54.23382481 +0000 UTC m=+53.513836032" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.240059 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:54 crc kubenswrapper[4865]: E1205 05:53:54.240683 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:54.740666304 +0000 UTC m=+54.020677526 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.300860 4865 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-82xnx container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.300924 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" podUID="87a82cae-057c-47d5-9703-eb48128e1bd9" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.342224 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:54 crc kubenswrapper[4865]: E1205 05:53:54.342640 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:54.842624898 +0000 UTC m=+54.122636120 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.356026 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" podStartSLOduration=29.356011148 podStartE2EDuration="29.356011148s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:54.353682732 +0000 UTC m=+53.633693964" watchObservedRunningTime="2025-12-05 05:53:54.356011148 +0000 UTC m=+53.636022370" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.368259 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xljcc"] Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.369534 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xljcc" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.408147 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.444674 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a4a0fc-43e5-4409-a8e5-bfa4b2525322-catalog-content\") pod \"redhat-operators-xljcc\" (UID: \"04a4a0fc-43e5-4409-a8e5-bfa4b2525322\") " pod="openshift-marketplace/redhat-operators-xljcc" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.444727 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prrgt\" (UniqueName: \"kubernetes.io/projected/04a4a0fc-43e5-4409-a8e5-bfa4b2525322-kube-api-access-prrgt\") pod \"redhat-operators-xljcc\" (UID: \"04a4a0fc-43e5-4409-a8e5-bfa4b2525322\") " pod="openshift-marketplace/redhat-operators-xljcc" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.444782 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a4a0fc-43e5-4409-a8e5-bfa4b2525322-utilities\") pod \"redhat-operators-xljcc\" (UID: \"04a4a0fc-43e5-4409-a8e5-bfa4b2525322\") " pod="openshift-marketplace/redhat-operators-xljcc" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.444828 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:54 crc kubenswrapper[4865]: E1205 05:53:54.445125 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:54.945113847 +0000 UTC m=+54.225125069 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.489460 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-d6z25" podStartSLOduration=29.489445885 podStartE2EDuration="29.489445885s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:54.42832142 +0000 UTC m=+53.708332642" watchObservedRunningTime="2025-12-05 05:53:54.489445885 +0000 UTC m=+53.769457107" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.490801 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" podStartSLOduration=29.490794793 podStartE2EDuration="29.490794793s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:54.487440558 +0000 UTC m=+53.767451770" watchObservedRunningTime="2025-12-05 05:53:54.490794793 +0000 UTC m=+53.770806015" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.545494 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:54 crc kubenswrapper[4865]: E1205 05:53:54.545761 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:55.045736562 +0000 UTC m=+54.325747784 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.545946 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a4a0fc-43e5-4409-a8e5-bfa4b2525322-catalog-content\") pod \"redhat-operators-xljcc\" (UID: \"04a4a0fc-43e5-4409-a8e5-bfa4b2525322\") " pod="openshift-marketplace/redhat-operators-xljcc" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.545984 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prrgt\" (UniqueName: \"kubernetes.io/projected/04a4a0fc-43e5-4409-a8e5-bfa4b2525322-kube-api-access-prrgt\") pod \"redhat-operators-xljcc\" (UID: \"04a4a0fc-43e5-4409-a8e5-bfa4b2525322\") " pod="openshift-marketplace/redhat-operators-xljcc" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.546190 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a4a0fc-43e5-4409-a8e5-bfa4b2525322-utilities\") pod \"redhat-operators-xljcc\" (UID: \"04a4a0fc-43e5-4409-a8e5-bfa4b2525322\") " pod="openshift-marketplace/redhat-operators-xljcc" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.549810 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a4a0fc-43e5-4409-a8e5-bfa4b2525322-catalog-content\") pod \"redhat-operators-xljcc\" (UID: \"04a4a0fc-43e5-4409-a8e5-bfa4b2525322\") " pod="openshift-marketplace/redhat-operators-xljcc" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.549986 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a4a0fc-43e5-4409-a8e5-bfa4b2525322-utilities\") pod \"redhat-operators-xljcc\" (UID: \"04a4a0fc-43e5-4409-a8e5-bfa4b2525322\") " pod="openshift-marketplace/redhat-operators-xljcc" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.561729 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p8dtw" podStartSLOduration=29.561713796 podStartE2EDuration="29.561713796s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:54.558198346 +0000 UTC m=+53.838209568" watchObservedRunningTime="2025-12-05 05:53:54.561713796 +0000 UTC m=+53.841725018" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.622623 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xljcc"] Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.639633 4865 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7jm2h container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.640064 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" podUID="ed63d1de-99a4-4b79-9aee-49a5484d968f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.648490 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:54 crc kubenswrapper[4865]: E1205 05:53:54.648980 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:55.148965442 +0000 UTC m=+54.428976664 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.663255 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccqrr" event={"ID":"582e42c0-b2d0-4b24-900e-1316a155c471","Type":"ContainerStarted","Data":"25bd2317d0ad9c82e708aead223c59477c04dca7149e90b3e4a14f7287917644"} Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.663308 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccqrr" event={"ID":"582e42c0-b2d0-4b24-900e-1316a155c471","Type":"ContainerStarted","Data":"34eac7efd1c866225c8af0dd70f2041b8e7475bb73714da5dac8f3a1fcacf567"} Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.665485 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdk79" event={"ID":"8ed66f14-1ac8-456b-b3bb-d909c0164767","Type":"ContainerStarted","Data":"66d18640f9d306b894c65efc4cae963b490435ddfb4178bc3cea859ed1eb904e"} Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.665524 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdk79" event={"ID":"8ed66f14-1ac8-456b-b3bb-d909c0164767","Type":"ContainerStarted","Data":"6f0bd87a1ca4479772d2d0bcfccbd9bfccf93e0ccca1c150d7681d30c8004184"} Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.671395 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l4sds" podStartSLOduration=29.671380458 podStartE2EDuration="29.671380458s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:54.670244826 +0000 UTC m=+53.950256048" 
watchObservedRunningTime="2025-12-05 05:53:54.671380458 +0000 UTC m=+53.951391700" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.682925 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" event={"ID":"63297c46-fc31-471d-99dc-47352f52e76c","Type":"ContainerStarted","Data":"dcdbabda80d5e6038d6ba98177c4f0d80bf5c9fab6c0c71a286e651bf332eb96"} Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.683091 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" podUID="c3633127-3192-43e8-87a2-5049b2d82fa6" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" gracePeriod=30 Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.683593 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.683730 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.686270 4865 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-dxzfq container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.686313 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" podUID="4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.699205 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mfc6c" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.749815 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:54 crc kubenswrapper[4865]: E1205 05:53:54.754868 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:55.254809646 +0000 UTC m=+54.534820878 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.830276 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prrgt\" (UniqueName: \"kubernetes.io/projected/04a4a0fc-43e5-4409-a8e5-bfa4b2525322-kube-api-access-prrgt\") pod \"redhat-operators-xljcc\" (UID: \"04a4a0fc-43e5-4409-a8e5-bfa4b2525322\") " pod="openshift-marketplace/redhat-operators-xljcc" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.857024 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:54 crc kubenswrapper[4865]: E1205 05:53:54.857423 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:55.357405908 +0000 UTC m=+54.637417130 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.954802 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" podStartSLOduration=29.954786562 podStartE2EDuration="29.954786562s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:54.873473114 +0000 UTC m=+54.153484336" watchObservedRunningTime="2025-12-05 05:53:54.954786562 +0000 UTC m=+54.234797784" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.956734 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rw9pr"] Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.958046 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rw9pr" Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.958291 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:54 crc kubenswrapper[4865]: E1205 05:53:54.958706 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:55.458692403 +0000 UTC m=+54.738703625 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:54 crc kubenswrapper[4865]: I1205 05:53:54.993998 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xljcc" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.064430 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j6c8\" (UniqueName: \"kubernetes.io/projected/748082c2-70ae-4b67-9c21-ff6f32030822-kube-api-access-4j6c8\") pod \"redhat-operators-rw9pr\" (UID: \"748082c2-70ae-4b67-9c21-ff6f32030822\") " pod="openshift-marketplace/redhat-operators-rw9pr" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.064912 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/748082c2-70ae-4b67-9c21-ff6f32030822-catalog-content\") pod \"redhat-operators-rw9pr\" (UID: \"748082c2-70ae-4b67-9c21-ff6f32030822\") " pod="openshift-marketplace/redhat-operators-rw9pr" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.064969 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.065113 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/748082c2-70ae-4b67-9c21-ff6f32030822-utilities\") pod \"redhat-operators-rw9pr\" (UID: \"748082c2-70ae-4b67-9c21-ff6f32030822\") " pod="openshift-marketplace/redhat-operators-rw9pr" Dec 05 05:53:55 crc kubenswrapper[4865]: E1205 05:53:55.065482 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:55.565469673 +0000 UTC m=+54.845480895 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:55 crc kubenswrapper[4865]: E1205 05:53:55.098978 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.104195 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.105148 4865 patch_prober.go:28] interesting pod/router-default-5444994796-h6t2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 05:53:55 crc kubenswrapper[4865]: [-]has-synced failed: reason withheld Dec 05 05:53:55 crc kubenswrapper[4865]: [+]process-running ok Dec 05 05:53:55 crc kubenswrapper[4865]: healthz check failed Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.105189 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h6t2z" podUID="6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:53:55 crc kubenswrapper[4865]: E1205 05:53:55.116002 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.116554 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.116582 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.116630 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.116648 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" 
output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:53:55 crc kubenswrapper[4865]: E1205 05:53:55.156082 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:53:55 crc kubenswrapper[4865]: E1205 05:53:55.156160 4865 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" podUID="c3633127-3192-43e8-87a2-5049b2d82fa6" containerName="kube-multus-additional-cni-plugins" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.177748 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.177783 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.178805 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.179142 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/748082c2-70ae-4b67-9c21-ff6f32030822-utilities\") pod \"redhat-operators-rw9pr\" (UID: \"748082c2-70ae-4b67-9c21-ff6f32030822\") " pod="openshift-marketplace/redhat-operators-rw9pr" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.179175 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j6c8\" (UniqueName: \"kubernetes.io/projected/748082c2-70ae-4b67-9c21-ff6f32030822-kube-api-access-4j6c8\") pod \"redhat-operators-rw9pr\" (UID: \"748082c2-70ae-4b67-9c21-ff6f32030822\") " pod="openshift-marketplace/redhat-operators-rw9pr" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.179194 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/748082c2-70ae-4b67-9c21-ff6f32030822-catalog-content\") pod \"redhat-operators-rw9pr\" (UID: \"748082c2-70ae-4b67-9c21-ff6f32030822\") " pod="openshift-marketplace/redhat-operators-rw9pr" Dec 05 05:53:55 crc kubenswrapper[4865]: E1205 05:53:55.180113 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:55.680077406 +0000 UTC m=+54.960088628 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.181034 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/748082c2-70ae-4b67-9c21-ff6f32030822-utilities\") pod \"redhat-operators-rw9pr\" (UID: \"748082c2-70ae-4b67-9c21-ff6f32030822\") " pod="openshift-marketplace/redhat-operators-rw9pr" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.181461 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/748082c2-70ae-4b67-9c21-ff6f32030822-catalog-content\") pod \"redhat-operators-rw9pr\" (UID: \"748082c2-70ae-4b67-9c21-ff6f32030822\") " pod="openshift-marketplace/redhat-operators-rw9pr" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.195779 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rw9pr"] Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.215326 4865 patch_prober.go:28] interesting pod/console-f9d7485db-lclsv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.215389 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-lclsv" podUID="dff0db39-9f6f-4455-8cab-8d4cdce33b04" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.280363 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:55 crc kubenswrapper[4865]: E1205 05:53:55.281636 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:55.781584256 +0000 UTC m=+55.061595478 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.303216 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jm2h" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.337167 4865 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-82xnx container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.337235 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" podUID="87a82cae-057c-47d5-9703-eb48128e1bd9" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.342246 4865 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-dxzfq container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.342292 4865 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8thws container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.342294 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" podUID="4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.342305 4865 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8thws container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.342246 4865 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-dxzfq container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.342341 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" podUID="4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.342360 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" podUID="1f49a368-065d-4057-a044-a019eba9ce9e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.342318 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" podUID="1f49a368-065d-4057-a044-a019eba9ce9e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.389285 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:55 crc kubenswrapper[4865]: E1205 05:53:55.390686 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:55.890667662 +0000 UTC m=+55.170678874 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.405038 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j6c8\" (UniqueName: \"kubernetes.io/projected/748082c2-70ae-4b67-9c21-ff6f32030822-kube-api-access-4j6c8\") pod \"redhat-operators-rw9pr\" (UID: \"748082c2-70ae-4b67-9c21-ff6f32030822\") " pod="openshift-marketplace/redhat-operators-rw9pr" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.443862 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-vqhhk" podStartSLOduration=30.443846662 podStartE2EDuration="30.443846662s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:55.41630993 +0000 UTC m=+54.696321152" watchObservedRunningTime="2025-12-05 05:53:55.443846662 +0000 UTC m=+54.723857884" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.492229 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: 
\"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:55 crc kubenswrapper[4865]: E1205 05:53:55.492974 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:55.992959086 +0000 UTC m=+55.272970308 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.593201 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rw9pr" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.595542 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:55 crc kubenswrapper[4865]: E1205 05:53:55.595966 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:56.095951829 +0000 UTC m=+55.375963051 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.700190 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:55 crc kubenswrapper[4865]: E1205 05:53:55.700547 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:56.200532347 +0000 UTC m=+55.480543569 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.752206 4865 generic.go:334] "Generic (PLEG): container finished" podID="582e42c0-b2d0-4b24-900e-1316a155c471" containerID="25bd2317d0ad9c82e708aead223c59477c04dca7149e90b3e4a14f7287917644" exitCode=0 Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.752975 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccqrr" event={"ID":"582e42c0-b2d0-4b24-900e-1316a155c471","Type":"ContainerDied","Data":"25bd2317d0ad9c82e708aead223c59477c04dca7149e90b3e4a14f7287917644"} Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.755407 4865 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.763473 4865 generic.go:334] "Generic (PLEG): container finished" podID="8ed66f14-1ac8-456b-b3bb-d909c0164767" containerID="66d18640f9d306b894c65efc4cae963b490435ddfb4178bc3cea859ed1eb904e" exitCode=0 Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.764589 4865 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-dxzfq container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.764655 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" podUID="4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.764953 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdk79" event={"ID":"8ed66f14-1ac8-456b-b3bb-d909c0164767","Type":"ContainerDied","Data":"66d18640f9d306b894c65efc4cae963b490435ddfb4178bc3cea859ed1eb904e"} Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.798957 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-drrtb" podStartSLOduration=30.798937519 podStartE2EDuration="30.798937519s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:55.732657898 +0000 UTC m=+55.012669120" watchObservedRunningTime="2025-12-05 05:53:55.798937519 +0000 UTC m=+55.078948741" Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.799346 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fsjkk"] Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.801881 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:55 crc kubenswrapper[4865]: E1205 05:53:55.802034 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:56.302010967 +0000 UTC m=+55.582022199 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.802362 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:55 crc kubenswrapper[4865]: E1205 05:53:55.805602 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:56.305587678 +0000 UTC m=+55.585598900 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:55 crc kubenswrapper[4865]: I1205 05:53:55.903381 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:55 crc kubenswrapper[4865]: E1205 05:53:55.903805 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:56.403790335 +0000 UTC m=+55.683801557 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.005485 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:56 crc kubenswrapper[4865]: E1205 05:53:56.005840 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:56.505813421 +0000 UTC m=+55.785824643 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.101846 4865 patch_prober.go:28] interesting pod/router-default-5444994796-h6t2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 05:53:56 crc kubenswrapper[4865]: [-]has-synced failed: reason withheld Dec 05 05:53:56 crc kubenswrapper[4865]: [+]process-running ok Dec 05 05:53:56 crc kubenswrapper[4865]: healthz check failed Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.101901 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h6t2z" podUID="6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.106393 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:56 crc kubenswrapper[4865]: E1205 05:53:56.106562 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:56.606531859 +0000 UTC m=+55.886543081 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.106641 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.106684 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.106750 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.106872 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.106904 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:53:56 crc kubenswrapper[4865]: E1205 05:53:56.106989 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:56.606981252 +0000 UTC m=+55.886992474 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.113214 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.113343 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.120741 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.126679 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.156383 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5zhc2"] Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.202437 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whlbb" podStartSLOduration=31.202421531 podStartE2EDuration="31.202421531s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:56.190222694 +0000 UTC m=+55.470233916" watchObservedRunningTime="2025-12-05 05:53:56.202421531 +0000 UTC m=+55.482432753" Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.203017 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mds9l"] Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.209703 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.210525 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:56 crc kubenswrapper[4865]: E1205 05:53:56.210936 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:56.710917192 +0000 UTC m=+55.990928414 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.252471 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jjg7p"] Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.276822 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.311969 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:56 crc kubenswrapper[4865]: E1205 05:53:56.312291 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:56.812280278 +0000 UTC m=+56.092291500 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.323342 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hgrgd" podStartSLOduration=31.323321402 podStartE2EDuration="31.323321402s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:56.314665246 +0000 UTC m=+55.594676468" watchObservedRunningTime="2025-12-05 05:53:56.323321402 +0000 UTC m=+55.603332624" Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.418712 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:56 crc kubenswrapper[4865]: E1205 05:53:56.419096 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:56.91908071 +0000 UTC m=+56.199091932 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.436313 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.525626 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:56 crc kubenswrapper[4865]: E1205 05:53:56.526022 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:57.026011164 +0000 UTC m=+56.306022386 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.658127 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:56 crc kubenswrapper[4865]: E1205 05:53:56.751715 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:57.159067431 +0000 UTC m=+56.439078653 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.755607 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ps4tl" podStartSLOduration=16.75559294 podStartE2EDuration="16.75559294s" podCreationTimestamp="2025-12-05 05:53:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:56.657052013 +0000 UTC m=+55.937063245" watchObservedRunningTime="2025-12-05 05:53:56.75559294 +0000 UTC m=+56.035604162" Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.761031 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:56 crc kubenswrapper[4865]: E1205 05:53:56.761431 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:57.261415725 +0000 UTC m=+56.541426947 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.784924 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qvs5j" podStartSLOduration=31.784901542 podStartE2EDuration="31.784901542s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:56.782127243 +0000 UTC m=+56.062138475" watchObservedRunningTime="2025-12-05 05:53:56.784901542 +0000 UTC m=+56.064912764" Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.862239 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:56 crc kubenswrapper[4865]: E1205 05:53:56.862563 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:57.362548386 +0000 UTC m=+56.642559608 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.956172 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-v2jhh" podStartSLOduration=31.956153682 podStartE2EDuration="31.956153682s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:56.939729436 +0000 UTC m=+56.219740658" watchObservedRunningTime="2025-12-05 05:53:56.956153682 +0000 UTC m=+56.236164904" Dec 05 05:53:56 crc kubenswrapper[4865]: I1205 05:53:56.966284 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:56 crc kubenswrapper[4865]: E1205 05:53:56.966593 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:57.466580018 +0000 UTC m=+56.746591240 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.077739 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:57 crc kubenswrapper[4865]: E1205 05:53:57.078628 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:57.578609417 +0000 UTC m=+56.858620639 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.108284 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zhc2" event={"ID":"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e","Type":"ContainerStarted","Data":"32595a5c9c95753de79666b4c4d850efdb66451a5c28f95170105b4d0b53b388"} Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.108320 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" event={"ID":"63297c46-fc31-471d-99dc-47352f52e76c","Type":"ContainerStarted","Data":"773d37e2f96940747bc6958aef9376b7c68520d4b6590033add06ebf01290c3a"} Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.109637 4865 patch_prober.go:28] interesting pod/router-default-5444994796-h6t2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 05:53:57 crc kubenswrapper[4865]: [-]has-synced failed: reason withheld Dec 05 05:53:57 crc kubenswrapper[4865]: [+]process-running ok Dec 05 05:53:57 crc kubenswrapper[4865]: healthz check failed Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.109689 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h6t2z" podUID="6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.123969 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zbsvt" podStartSLOduration=32.123945984 podStartE2EDuration="32.123945984s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:57.123628235 +0000 UTC m=+56.403639457" watchObservedRunningTime="2025-12-05 05:53:57.123945984 +0000 UTC m=+56.403957206" Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.181058 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:57 crc kubenswrapper[4865]: E1205 05:53:57.181373 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:57.681360034 +0000 UTC m=+56.961371256 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.187258 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsjkk" event={"ID":"fc0b366c-dba6-4a98-8335-e5434858e367","Type":"ContainerStarted","Data":"866030eaba4643461f3e17884d5436967d7d0e7fd465026e53fc7f73e1224b78"} Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.208882 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjg7p" event={"ID":"f43580fe-7567-4fb7-b1fc-203bda11942a","Type":"ContainerStarted","Data":"48b466c65c1407c75f36ab9823eb5e3dbd882a4917f42a3a9e7f16bebed84e43"} Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.259994 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mds9l" event={"ID":"bbe8803a-815d-4318-bfaa-1949755ed910","Type":"ContainerStarted","Data":"44e296713a12c2a46acb991a34d0d5b61ce59a03fccb07b03c4e6c65a4c53454"} Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.276060 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zjllp" podStartSLOduration=32.276039601 podStartE2EDuration="32.276039601s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:53:57.27320667 +0000 UTC m=+56.553217892" watchObservedRunningTime="2025-12-05 05:53:57.276039601 +0000 UTC m=+56.556050823" Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.281717 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:57 crc kubenswrapper[4865]: E1205 05:53:57.283066 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:57.783047429 +0000 UTC m=+57.063058651 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.383044 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:57 crc kubenswrapper[4865]: E1205 05:53:57.383387 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:57.883374387 +0000 UTC m=+57.163385609 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.485711 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:57 crc kubenswrapper[4865]: E1205 05:53:57.486107 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:57.986088062 +0000 UTC m=+57.266099284 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.586749 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:57 crc kubenswrapper[4865]: E1205 05:53:57.587064 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:58.087053537 +0000 UTC m=+57.367064759 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.687477 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:57 crc kubenswrapper[4865]: E1205 05:53:57.687707 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:58.187693854 +0000 UTC m=+57.467705076 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.693508 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xljcc"] Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.790703 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:57 crc kubenswrapper[4865]: E1205 05:53:57.791129 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:58.291108508 +0000 UTC m=+57.571119730 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.892886 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:57 crc kubenswrapper[4865]: E1205 05:53:57.893591 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:58.393575336 +0000 UTC m=+57.673586558 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:57 crc kubenswrapper[4865]: I1205 05:53:57.995478 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:57 crc kubenswrapper[4865]: E1205 05:53:57.995785 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:58.495772526 +0000 UTC m=+57.775783748 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.133592 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:58 crc kubenswrapper[4865]: E1205 05:53:58.133987 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:58.633967699 +0000 UTC m=+57.913978921 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.235882 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:58 crc kubenswrapper[4865]: E1205 05:53:58.236209 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:58.73619804 +0000 UTC m=+58.016209252 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.339412 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:58 crc kubenswrapper[4865]: E1205 05:53:58.340132 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:58.840109099 +0000 UTC m=+58.120120331 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.359324 4865 generic.go:334] "Generic (PLEG): container finished" podID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" containerID="d458832c57ba86e3d002b96d902ecceba3b53620221417213af4d9b88d7bd7bb" exitCode=0 Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.359410 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zhc2" event={"ID":"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e","Type":"ContainerDied","Data":"d458832c57ba86e3d002b96d902ecceba3b53620221417213af4d9b88d7bd7bb"} Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.374449 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xljcc" event={"ID":"04a4a0fc-43e5-4409-a8e5-bfa4b2525322","Type":"ContainerStarted","Data":"e3a74d3a20f449c76a1ee7645a9553a8c73e26f3b3fe573259dbafa95ccb8c43"} Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.389764 4865 generic.go:334] "Generic (PLEG): container finished" podID="fc0b366c-dba6-4a98-8335-e5434858e367" containerID="e8db9b028a2622ff385660a93281e21c61c7ac09f0e4e31755a5b4208700541c" exitCode=0 Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.389873 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsjkk" event={"ID":"fc0b366c-dba6-4a98-8335-e5434858e367","Type":"ContainerDied","Data":"e8db9b028a2622ff385660a93281e21c61c7ac09f0e4e31755a5b4208700541c"} Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.420536 4865 generic.go:334] "Generic (PLEG): container finished" podID="f43580fe-7567-4fb7-b1fc-203bda11942a" containerID="cf543d72370a72d55aa4e8c98442513b27a05de901e72cb03c4d3aebd5698f68" exitCode=0 Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.420635 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjg7p" event={"ID":"f43580fe-7567-4fb7-b1fc-203bda11942a","Type":"ContainerDied","Data":"cf543d72370a72d55aa4e8c98442513b27a05de901e72cb03c4d3aebd5698f68"} Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.432170 4865 patch_prober.go:28] interesting pod/router-default-5444994796-h6t2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 05:53:58 crc kubenswrapper[4865]: [-]has-synced failed: reason withheld Dec 05 05:53:58 crc kubenswrapper[4865]: [+]process-running ok Dec 05 05:53:58 crc kubenswrapper[4865]: healthz check failed Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.432227 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h6t2z" podUID="6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.440806 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:58 crc kubenswrapper[4865]: E1205 05:53:58.441143 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:58.941121835 +0000 UTC m=+58.221133057 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.441597 4865 generic.go:334] "Generic (PLEG): container finished" podID="bbe8803a-815d-4318-bfaa-1949755ed910" containerID="971d99082909a62fe82a7d323eef29d816b804c318fe36a3817fcdb675ba4933" exitCode=0 Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.441628 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mds9l" event={"ID":"bbe8803a-815d-4318-bfaa-1949755ed910","Type":"ContainerDied","Data":"971d99082909a62fe82a7d323eef29d816b804c318fe36a3817fcdb675ba4933"} Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.568434 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:58 crc kubenswrapper[4865]: E1205 05:53:58.569708 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:59.069687214 +0000 UTC m=+58.349698436 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.672689 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:58 crc kubenswrapper[4865]: E1205 05:53:58.673013 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:59.173001836 +0000 UTC m=+58.453013058 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.776552 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:58 crc kubenswrapper[4865]: E1205 05:53:58.776985 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:59.276964027 +0000 UTC m=+58.556975249 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.887139 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:58 crc kubenswrapper[4865]: E1205 05:53:58.887910 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:59.387898035 +0000 UTC m=+58.667909257 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:58 crc kubenswrapper[4865]: I1205 05:53:58.990509 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:58 crc kubenswrapper[4865]: E1205 05:53:58.990806 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:59.490790865 +0000 UTC m=+58.770802087 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.039560 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rw9pr"] Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.043934 4865 patch_prober.go:28] interesting pod/apiserver-76f77b778f-pv5k9 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]log ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]etcd ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]poststarthook/max-in-flight-filter ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 05 05:53:59 crc kubenswrapper[4865]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 05 05:53:59 crc kubenswrapper[4865]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 05 05:53:59 crc kubenswrapper[4865]: [+]poststarthook/project.openshift.io-projectcache ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 05 05:53:59 crc kubenswrapper[4865]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Dec 05 05:53:59 crc kubenswrapper[4865]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 05 05:53:59 crc kubenswrapper[4865]: livez check failed Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.044004 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" podUID="320fa4b0-5e00-4bca-b8f2-1afa7387d156" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.092832 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:59 crc kubenswrapper[4865]: E1205 05:53:59.093182 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:59.593165961 +0000 UTC m=+58.873177173 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.121094 4865 patch_prober.go:28] interesting pod/apiserver-76f77b778f-pv5k9 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]log ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]etcd ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]poststarthook/max-in-flight-filter ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 05 05:53:59 crc kubenswrapper[4865]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 05 05:53:59 crc kubenswrapper[4865]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 05 05:53:59 crc kubenswrapper[4865]: [+]poststarthook/project.openshift.io-projectcache ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]poststarthook/openshift.io-startinformers ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 05 05:53:59 crc kubenswrapper[4865]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 05 05:53:59 crc kubenswrapper[4865]: livez check failed Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.121172 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" podUID="320fa4b0-5e00-4bca-b8f2-1afa7387d156" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.127714 4865 patch_prober.go:28] interesting pod/router-default-5444994796-h6t2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 05:53:59 crc kubenswrapper[4865]: [-]has-synced failed: reason withheld Dec 05 05:53:59 crc kubenswrapper[4865]: [+]process-running ok Dec 05 05:53:59 crc kubenswrapper[4865]: healthz check failed Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.128123 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h6t2z" podUID="6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.201691 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:59 crc kubenswrapper[4865]: E1205 05:53:59.201996 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:59.701979359 +0000 UTC m=+58.981990581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.306590 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:59 crc kubenswrapper[4865]: E1205 05:53:59.306927 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:53:59.806913697 +0000 UTC m=+59.086924919 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.339021 4865 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-dxzfq container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.339243 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" podUID="4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.342954 4865 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-dxzfq container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.343004 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" podUID="4d95b3dd-ac64-4224-bd8a-8ab0d8beeb00" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.407875 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:59 crc kubenswrapper[4865]: E1205 05:53:59.408323 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:53:59.908286374 +0000 UTC m=+59.188297596 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.509021 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:59 crc kubenswrapper[4865]: E1205 05:53:59.509311 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:54:00.009299231 +0000 UTC m=+59.289310453 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.521041 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw9pr" event={"ID":"748082c2-70ae-4b67-9c21-ff6f32030822","Type":"ContainerStarted","Data":"0a513d1494eb354e5c3747503c4cc292fa94e445c4f54c329cd0890da2b8e59a"} Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.566341 4865 generic.go:334] "Generic (PLEG): container finished" podID="04a4a0fc-43e5-4409-a8e5-bfa4b2525322" containerID="5775199bc83ffa7b87ff62243a5ad5dc2f5bec21d50ac73175d93f92060dc9cd" exitCode=0 Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.566438 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xljcc" event={"ID":"04a4a0fc-43e5-4409-a8e5-bfa4b2525322","Type":"ContainerDied","Data":"5775199bc83ffa7b87ff62243a5ad5dc2f5bec21d50ac73175d93f92060dc9cd"} Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.611352 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:59 crc kubenswrapper[4865]: E1205 05:53:59.611792 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:54:00.111775789 +0000 UTC m=+59.391787011 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.615055 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" event={"ID":"63297c46-fc31-471d-99dc-47352f52e76c","Type":"ContainerStarted","Data":"f2de38705b44e9e999466f97b91c3c614cab4dd7b4114a115979ce2714a43d27"} Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.714699 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:59 crc kubenswrapper[4865]: E1205 05:53:59.715348 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:54:00.215334879 +0000 UTC m=+59.495346101 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.816436 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:59 crc kubenswrapper[4865]: E1205 05:53:59.816595 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:54:00.316572262 +0000 UTC m=+59.596583484 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.816731 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:59 crc kubenswrapper[4865]: E1205 05:53:59.817103 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:54:00.317084346 +0000 UTC m=+59.597095568 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.920423 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:53:59 crc kubenswrapper[4865]: E1205 05:53:59.920498 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:54:00.420483341 +0000 UTC m=+59.700494553 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.921173 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:53:59 crc kubenswrapper[4865]: E1205 05:53:59.922650 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:54:00.422632952 +0000 UTC m=+59.702644174 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:53:59 crc kubenswrapper[4865]: I1205 05:53:59.962441 4865 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.028396 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:54:00 crc kubenswrapper[4865]: E1205 05:54:00.028686 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:54:00.528669911 +0000 UTC m=+59.808681123 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.115294 4865 patch_prober.go:28] interesting pod/router-default-5444994796-h6t2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 05:54:00 crc kubenswrapper[4865]: [-]has-synced failed: reason withheld Dec 05 05:54:00 crc kubenswrapper[4865]: [+]process-running ok Dec 05 05:54:00 crc kubenswrapper[4865]: healthz check failed Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.115400 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h6t2z" podUID="6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.129751 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:54:00 crc kubenswrapper[4865]: E1205 05:54:00.130149 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:54:00.630136901 +0000 UTC m=+59.910148123 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:54:00 crc kubenswrapper[4865]: W1205 05:54:00.213487 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-2808510345c21e530b308aa04540be3f7b82f48619990825cf1b1acff4d420e6 WatchSource:0}: Error finding container 2808510345c21e530b308aa04540be3f7b82f48619990825cf1b1acff4d420e6: Status 404 returned error can't find the container with id 2808510345c21e530b308aa04540be3f7b82f48619990825cf1b1acff4d420e6 Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.232038 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:54:00 crc kubenswrapper[4865]: E1205 05:54:00.232470 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:54:00.732455305 +0000 UTC m=+60.012466527 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.271456 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-wsxc7" podStartSLOduration=20.271438411 podStartE2EDuration="20.271438411s" podCreationTimestamp="2025-12-05 05:53:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:54:00.269519027 +0000 UTC m=+59.549530249" watchObservedRunningTime="2025-12-05 05:54:00.271438411 +0000 UTC m=+59.551449633" Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.334060 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:54:00 crc kubenswrapper[4865]: E1205 05:54:00.334397 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-05 05:54:00.834385878 +0000 UTC m=+60.114397100 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.380752 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ps4tl" Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.441629 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:54:00 crc kubenswrapper[4865]: E1205 05:54:00.441992 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:54:00.941977141 +0000 UTC m=+60.221988363 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.442036 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:54:00 crc kubenswrapper[4865]: E1205 05:54:00.442297 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:54:00.94229028 +0000 UTC m=+60.222301502 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.542993 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:54:00 crc kubenswrapper[4865]: E1205 05:54:00.545279 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:54:01.045228622 +0000 UTC m=+60.325239844 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.630734 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.641039 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.642118 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.649101 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:54:00 crc kubenswrapper[4865]: E1205 05:54:00.649433 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 05:54:01.149417699 +0000 UTC m=+60.429428921 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fth8" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.671113 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.671727 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.675030 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"cbf96c651654494ada8ad09058e53197f21d9239dc9d9325968186aa48ddba64"} Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.682202 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b9dbab1a4ba9fd4ced0d4caf5b25cac331060cc202d0c8648adfb751ed691431"} Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.682278 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"474772e4628ba48a6c3cb5043953ac245618d78e71ba819adbed6e066d05e26b"} Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.684222 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.703721 4865 generic.go:334] "Generic (PLEG): container finished" podID="748082c2-70ae-4b67-9c21-ff6f32030822" containerID="595bebb32aacb73c9cd49cb839653596feca624c51476c863f2aea315441e055" exitCode=0 Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.704255 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw9pr" event={"ID":"748082c2-70ae-4b67-9c21-ff6f32030822","Type":"ContainerDied","Data":"595bebb32aacb73c9cd49cb839653596feca624c51476c863f2aea315441e055"} Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.724094 4865 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-05T05:53:59.962485483Z","Handler":null,"Name":""} Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.753718 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.754086 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/cc311200-23fa-499a-8c75-6450d4762f3d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cc311200-23fa-499a-8c75-6450d4762f3d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.754185 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc311200-23fa-499a-8c75-6450d4762f3d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cc311200-23fa-499a-8c75-6450d4762f3d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 05:54:00 crc kubenswrapper[4865]: E1205 05:54:00.754326 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 05:54:01.254305465 +0000 UTC m=+60.534316677 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.786908 4865 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.786954 4865 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.796804 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2808510345c21e530b308aa04540be3f7b82f48619990825cf1b1acff4d420e6"} Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.857656 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.857698 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc311200-23fa-499a-8c75-6450d4762f3d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cc311200-23fa-499a-8c75-6450d4762f3d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.857765 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc311200-23fa-499a-8c75-6450d4762f3d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cc311200-23fa-499a-8c75-6450d4762f3d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 05:54:00 crc 
kubenswrapper[4865]: I1205 05:54:00.857850 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc311200-23fa-499a-8c75-6450d4762f3d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cc311200-23fa-499a-8c75-6450d4762f3d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.958100 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.980095 4865 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 05:54:00 crc kubenswrapper[4865]: I1205 05:54:00.980140 4865 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:54:01 crc kubenswrapper[4865]: I1205 05:54:01.152584 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc311200-23fa-499a-8c75-6450d4762f3d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cc311200-23fa-499a-8c75-6450d4762f3d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 05:54:01 crc kubenswrapper[4865]: I1205 05:54:01.172794 4865 patch_prober.go:28] interesting pod/router-default-5444994796-h6t2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 05:54:01 crc kubenswrapper[4865]: [-]has-synced failed: reason withheld Dec 05 05:54:01 crc kubenswrapper[4865]: [+]process-running ok Dec 05 05:54:01 crc kubenswrapper[4865]: healthz check failed Dec 05 05:54:01 crc kubenswrapper[4865]: I1205 05:54:01.172912 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h6t2z" podUID="6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:54:01 crc kubenswrapper[4865]: I1205 05:54:01.337289 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 05:54:01 crc kubenswrapper[4865]: I1205 05:54:01.338950 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 05:54:01 crc kubenswrapper[4865]: I1205 05:54:01.508663 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 05:54:01 crc kubenswrapper[4865]: I1205 05:54:01.509644 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 05:54:01 crc kubenswrapper[4865]: I1205 05:54:01.509720 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxzfq" Dec 05 05:54:01 crc kubenswrapper[4865]: I1205 05:54:01.509877 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 05:54:01 crc kubenswrapper[4865]: I1205 05:54:01.523702 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 05 05:54:01 crc kubenswrapper[4865]: I1205 05:54:01.523941 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 05 05:54:01 crc kubenswrapper[4865]: I1205 05:54:01.628992 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07656fb2-3659-4d14-8c59-72e8531ec4c0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"07656fb2-3659-4d14-8c59-72e8531ec4c0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 05:54:01 crc kubenswrapper[4865]: I1205 05:54:01.629531 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07656fb2-3659-4d14-8c59-72e8531ec4c0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"07656fb2-3659-4d14-8c59-72e8531ec4c0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 05:54:01 crc kubenswrapper[4865]: I1205 05:54:01.731317 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07656fb2-3659-4d14-8c59-72e8531ec4c0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"07656fb2-3659-4d14-8c59-72e8531ec4c0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 05:54:01 crc kubenswrapper[4865]: I1205 05:54:01.731364 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07656fb2-3659-4d14-8c59-72e8531ec4c0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"07656fb2-3659-4d14-8c59-72e8531ec4c0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 05:54:01 crc kubenswrapper[4865]: I1205 05:54:01.731480 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07656fb2-3659-4d14-8c59-72e8531ec4c0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"07656fb2-3659-4d14-8c59-72e8531ec4c0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 05:54:01 crc kubenswrapper[4865]: I1205 05:54:01.791524 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07656fb2-3659-4d14-8c59-72e8531ec4c0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"07656fb2-3659-4d14-8c59-72e8531ec4c0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 05:54:01 crc kubenswrapper[4865]: I1205 05:54:01.902304 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 05:54:02 crc kubenswrapper[4865]: I1205 05:54:01.994880 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"775a87e46c156faf9533024881753488a321f4eef495aa73287bf96c4196b6ac"} Dec 05 05:54:02 crc kubenswrapper[4865]: I1205 05:54:02.107362 4865 patch_prober.go:28] interesting pod/router-default-5444994796-h6t2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 05:54:02 crc kubenswrapper[4865]: [-]has-synced failed: reason withheld Dec 05 05:54:02 crc kubenswrapper[4865]: [+]process-running ok Dec 05 05:54:02 crc kubenswrapper[4865]: healthz check failed Dec 05 05:54:02 crc kubenswrapper[4865]: I1205 05:54:02.107743 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h6t2z" podUID="6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:54:02 crc kubenswrapper[4865]: I1205 05:54:02.168816 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fth8\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:54:02 crc kubenswrapper[4865]: I1205 05:54:02.257155 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 05:54:02 crc kubenswrapper[4865]: I1205 05:54:02.262929 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:54:02 crc kubenswrapper[4865]: I1205 05:54:02.263511 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 05:54:02 crc kubenswrapper[4865]: I1205 05:54:02.332171 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 05:54:02 crc kubenswrapper[4865]: I1205 05:54:02.992558 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 05:54:02 crc kubenswrapper[4865]: I1205 05:54:02.996780 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:54:03 crc kubenswrapper[4865]: I1205 05:54:03.059451 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 05 05:54:03 crc kubenswrapper[4865]: I1205 05:54:03.060053 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-pv5k9" Dec 05 05:54:03 crc kubenswrapper[4865]: I1205 05:54:03.124595 4865 patch_prober.go:28] interesting pod/router-default-5444994796-h6t2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 05:54:03 crc kubenswrapper[4865]: [-]has-synced failed: reason withheld Dec 05 05:54:03 crc kubenswrapper[4865]: [+]process-running ok Dec 05 05:54:03 crc kubenswrapper[4865]: healthz check failed Dec 05 05:54:03 crc kubenswrapper[4865]: I1205 05:54:03.124659 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h6t2z" podUID="6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:54:03 crc kubenswrapper[4865]: I1205 05:54:03.304708 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b9a810731693d6a6be5d620277a28b4b8e28740a349ef3f11cb68ffa79fe4625"} Dec 05 05:54:03 crc kubenswrapper[4865]: I1205 05:54:03.319582 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:54:03 crc kubenswrapper[4865]: I1205 05:54:03.819420 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6fth8"] Dec 05 05:54:03 crc kubenswrapper[4865]: I1205 05:54:03.947148 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 05:54:04 crc kubenswrapper[4865]: I1205 05:54:04.103537 4865 patch_prober.go:28] interesting pod/router-default-5444994796-h6t2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 05:54:04 crc kubenswrapper[4865]: [-]has-synced failed: reason withheld Dec 05 05:54:04 crc kubenswrapper[4865]: [+]process-running ok Dec 05 05:54:04 crc kubenswrapper[4865]: healthz check failed Dec 05 05:54:04 crc kubenswrapper[4865]: I1205 05:54:04.103583 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h6t2z" podUID="6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:54:04 crc kubenswrapper[4865]: I1205 05:54:04.377859 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cc311200-23fa-499a-8c75-6450d4762f3d","Type":"ContainerStarted","Data":"86ebba7b666f2b320d271da0279fb387983c3e54e1ceafff94c358a69c32bca4"} Dec 05 05:54:04 crc kubenswrapper[4865]: I1205 05:54:04.555712 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" event={"ID":"5dbbc361-a522-4548-a14e-bdd061c7bc4b","Type":"ContainerStarted","Data":"af6bc3c359d7b5decbce624b71a763a5b4f6d7bc2a0c68d61a15813fcd1ded51"} Dec 05 05:54:04 crc kubenswrapper[4865]: I1205 05:54:04.653326 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"07656fb2-3659-4d14-8c59-72e8531ec4c0","Type":"ContainerStarted","Data":"bd95bb2fd4ac6c94108a1f519a6cb809304c3faf8c37a83b5d7144cb6ee0bfa3"} Dec 05 05:54:05 crc kubenswrapper[4865]: I1205 05:54:05.103847 4865 patch_prober.go:28] interesting pod/router-default-5444994796-h6t2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 05:54:05 crc kubenswrapper[4865]: [-]has-synced failed: reason withheld Dec 05 05:54:05 crc kubenswrapper[4865]: [+]process-running ok Dec 05 05:54:05 crc kubenswrapper[4865]: healthz check failed Dec 05 05:54:05 crc kubenswrapper[4865]: I1205 05:54:05.103904 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h6t2z" podUID="6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:54:05 crc kubenswrapper[4865]: I1205 05:54:05.116586 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:54:05 crc kubenswrapper[4865]: I1205 05:54:05.116647 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:54:05 crc kubenswrapper[4865]: I1205 05:54:05.116591 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:54:05 crc kubenswrapper[4865]: I1205 05:54:05.116977 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:54:05 crc kubenswrapper[4865]: E1205 05:54:05.166330 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:54:05 crc kubenswrapper[4865]: I1205 05:54:05.182931 4865 
patch_prober.go:28] interesting pod/console-f9d7485db-lclsv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Dec 05 05:54:05 crc kubenswrapper[4865]: I1205 05:54:05.182994 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-lclsv" podUID="dff0db39-9f6f-4455-8cab-8d4cdce33b04" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" Dec 05 05:54:05 crc kubenswrapper[4865]: E1205 05:54:05.265005 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:54:05 crc kubenswrapper[4865]: E1205 05:54:05.293325 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:54:05 crc kubenswrapper[4865]: E1205 05:54:05.293438 4865 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" podUID="c3633127-3192-43e8-87a2-5049b2d82fa6" containerName="kube-multus-additional-cni-plugins" Dec 05 05:54:05 crc kubenswrapper[4865]: I1205 05:54:05.323188 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" Dec 05 05:54:05 crc kubenswrapper[4865]: I1205 05:54:05.730204 4865 generic.go:334] "Generic (PLEG): container finished" podID="d60e1629-83b6-4492-bd6c-c0ed90da02be" containerID="600c03f669aea52a78fb1980f4670028153595dc18e964c5d30397e0e471a32e" exitCode=0 Dec 05 05:54:05 crc kubenswrapper[4865]: I1205 05:54:05.730349 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb" event={"ID":"d60e1629-83b6-4492-bd6c-c0ed90da02be","Type":"ContainerDied","Data":"600c03f669aea52a78fb1980f4670028153595dc18e964c5d30397e0e471a32e"} Dec 05 05:54:05 crc kubenswrapper[4865]: I1205 05:54:05.733124 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" event={"ID":"5dbbc361-a522-4548-a14e-bdd061c7bc4b","Type":"ContainerStarted","Data":"4e3c74d39fc21ac0c8c699f3a5e5b8fb87858e8ffed6674964b1ccdf9035438e"} Dec 05 05:54:05 crc kubenswrapper[4865]: I1205 05:54:05.736540 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:54:05 crc kubenswrapper[4865]: I1205 05:54:05.795172 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" podStartSLOduration=40.795144616 podStartE2EDuration="40.795144616s" podCreationTimestamp="2025-12-05 05:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-05 05:54:05.787736896 +0000 UTC m=+65.067748118" watchObservedRunningTime="2025-12-05 05:54:05.795144616 +0000 UTC m=+65.075155848" Dec 05 05:54:06 crc kubenswrapper[4865]: I1205 05:54:06.103324 4865 patch_prober.go:28] interesting pod/router-default-5444994796-h6t2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 05:54:06 crc kubenswrapper[4865]: [-]has-synced failed: reason withheld Dec 05 05:54:06 crc kubenswrapper[4865]: [+]process-running ok Dec 05 05:54:06 crc kubenswrapper[4865]: healthz check failed Dec 05 05:54:06 crc kubenswrapper[4865]: I1205 05:54:06.103381 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h6t2z" podUID="6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:54:06 crc kubenswrapper[4865]: I1205 05:54:06.789658 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cc311200-23fa-499a-8c75-6450d4762f3d","Type":"ContainerStarted","Data":"9db506cb39f99c048ff7eb768c40ee2de76c9af159c37ffabf0a34fc67df0ef4"} Dec 05 05:54:07 crc kubenswrapper[4865]: I1205 05:54:07.153663 4865 patch_prober.go:28] interesting pod/router-default-5444994796-h6t2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 05:54:07 crc kubenswrapper[4865]: [-]has-synced failed: reason withheld Dec 05 05:54:07 crc kubenswrapper[4865]: [+]process-running ok Dec 05 05:54:07 crc kubenswrapper[4865]: healthz check failed Dec 05 05:54:07 crc kubenswrapper[4865]: I1205 05:54:07.153729 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h6t2z" podUID="6a712fd8-b8cb-4ac2-955d-b19e4cd36c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:54:07 crc kubenswrapper[4865]: I1205 05:54:07.865900 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"07656fb2-3659-4d14-8c59-72e8531ec4c0","Type":"ContainerStarted","Data":"1c7b5d14971d2e50a74a353eb1a31810c8cf0cb615d88f9d19f56c53f4f5a3df"} Dec 05 05:54:07 crc kubenswrapper[4865]: I1205 05:54:07.868660 4865 generic.go:334] "Generic (PLEG): container finished" podID="cc311200-23fa-499a-8c75-6450d4762f3d" containerID="9db506cb39f99c048ff7eb768c40ee2de76c9af159c37ffabf0a34fc67df0ef4" exitCode=0 Dec 05 05:54:07 crc kubenswrapper[4865]: I1205 05:54:07.869447 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cc311200-23fa-499a-8c75-6450d4762f3d","Type":"ContainerDied","Data":"9db506cb39f99c048ff7eb768c40ee2de76c9af159c37ffabf0a34fc67df0ef4"} Dec 05 05:54:07 crc kubenswrapper[4865]: I1205 05:54:07.911129 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=6.911111049 podStartE2EDuration="6.911111049s" podCreationTimestamp="2025-12-05 05:54:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:54:07.910360338 +0000 UTC m=+67.190371580" 
watchObservedRunningTime="2025-12-05 05:54:07.911111049 +0000 UTC m=+67.191122271" Dec 05 05:54:07 crc kubenswrapper[4865]: I1205 05:54:07.967870 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb" Dec 05 05:54:08 crc kubenswrapper[4865]: I1205 05:54:08.104437 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wqh5\" (UniqueName: \"kubernetes.io/projected/d60e1629-83b6-4492-bd6c-c0ed90da02be-kube-api-access-7wqh5\") pod \"d60e1629-83b6-4492-bd6c-c0ed90da02be\" (UID: \"d60e1629-83b6-4492-bd6c-c0ed90da02be\") " Dec 05 05:54:08 crc kubenswrapper[4865]: I1205 05:54:08.104503 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d60e1629-83b6-4492-bd6c-c0ed90da02be-config-volume\") pod \"d60e1629-83b6-4492-bd6c-c0ed90da02be\" (UID: \"d60e1629-83b6-4492-bd6c-c0ed90da02be\") " Dec 05 05:54:08 crc kubenswrapper[4865]: I1205 05:54:08.105546 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d60e1629-83b6-4492-bd6c-c0ed90da02be-config-volume" (OuterVolumeSpecName: "config-volume") pod "d60e1629-83b6-4492-bd6c-c0ed90da02be" (UID: "d60e1629-83b6-4492-bd6c-c0ed90da02be"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:54:08 crc kubenswrapper[4865]: I1205 05:54:08.206071 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d60e1629-83b6-4492-bd6c-c0ed90da02be-secret-volume\") pod \"d60e1629-83b6-4492-bd6c-c0ed90da02be\" (UID: \"d60e1629-83b6-4492-bd6c-c0ed90da02be\") " Dec 05 05:54:08 crc kubenswrapper[4865]: I1205 05:54:08.210468 4865 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d60e1629-83b6-4492-bd6c-c0ed90da02be-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 05:54:08 crc kubenswrapper[4865]: I1205 05:54:08.335817 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:54:08 crc kubenswrapper[4865]: I1205 05:54:08.407961 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-h6t2z" Dec 05 05:54:08 crc kubenswrapper[4865]: I1205 05:54:08.431790 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d60e1629-83b6-4492-bd6c-c0ed90da02be-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d60e1629-83b6-4492-bd6c-c0ed90da02be" (UID: "d60e1629-83b6-4492-bd6c-c0ed90da02be"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:54:08 crc kubenswrapper[4865]: I1205 05:54:08.432124 4865 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d60e1629-83b6-4492-bd6c-c0ed90da02be-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 05:54:08 crc kubenswrapper[4865]: I1205 05:54:08.435042 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d60e1629-83b6-4492-bd6c-c0ed90da02be-kube-api-access-7wqh5" (OuterVolumeSpecName: "kube-api-access-7wqh5") pod "d60e1629-83b6-4492-bd6c-c0ed90da02be" (UID: "d60e1629-83b6-4492-bd6c-c0ed90da02be"). 
InnerVolumeSpecName "kube-api-access-7wqh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:54:08 crc kubenswrapper[4865]: I1205 05:54:08.533250 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wqh5\" (UniqueName: \"kubernetes.io/projected/d60e1629-83b6-4492-bd6c-c0ed90da02be-kube-api-access-7wqh5\") on node \"crc\" DevicePath \"\"" Dec 05 05:54:08 crc kubenswrapper[4865]: I1205 05:54:08.964518 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb" event={"ID":"d60e1629-83b6-4492-bd6c-c0ed90da02be","Type":"ContainerDied","Data":"bf6908eb5c505f14f8230eea34dfed02b8f8547650eb9bc2d2216e471543ebd7"} Dec 05 05:54:08 crc kubenswrapper[4865]: I1205 05:54:08.964576 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf6908eb5c505f14f8230eea34dfed02b8f8547650eb9bc2d2216e471543ebd7" Dec 05 05:54:08 crc kubenswrapper[4865]: I1205 05:54:08.964669 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb" Dec 05 05:54:09 crc kubenswrapper[4865]: I1205 05:54:09.462028 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 05:54:09 crc kubenswrapper[4865]: I1205 05:54:09.776229 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 05:54:09 crc kubenswrapper[4865]: I1205 05:54:09.887312 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc311200-23fa-499a-8c75-6450d4762f3d-kubelet-dir\") pod \"cc311200-23fa-499a-8c75-6450d4762f3d\" (UID: \"cc311200-23fa-499a-8c75-6450d4762f3d\") " Dec 05 05:54:09 crc kubenswrapper[4865]: I1205 05:54:09.887384 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc311200-23fa-499a-8c75-6450d4762f3d-kube-api-access\") pod \"cc311200-23fa-499a-8c75-6450d4762f3d\" (UID: \"cc311200-23fa-499a-8c75-6450d4762f3d\") " Dec 05 05:54:09 crc kubenswrapper[4865]: I1205 05:54:09.899016 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc311200-23fa-499a-8c75-6450d4762f3d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cc311200-23fa-499a-8c75-6450d4762f3d" (UID: "cc311200-23fa-499a-8c75-6450d4762f3d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 05:54:09 crc kubenswrapper[4865]: I1205 05:54:09.945052 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc311200-23fa-499a-8c75-6450d4762f3d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cc311200-23fa-499a-8c75-6450d4762f3d" (UID: "cc311200-23fa-499a-8c75-6450d4762f3d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:54:09 crc kubenswrapper[4865]: I1205 05:54:09.991522 4865 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc311200-23fa-499a-8c75-6450d4762f3d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 05:54:09 crc kubenswrapper[4865]: I1205 05:54:09.991552 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc311200-23fa-499a-8c75-6450d4762f3d-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 05:54:10 crc kubenswrapper[4865]: I1205 05:54:10.062893 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cc311200-23fa-499a-8c75-6450d4762f3d","Type":"ContainerDied","Data":"86ebba7b666f2b320d271da0279fb387983c3e54e1ceafff94c358a69c32bca4"} Dec 05 05:54:10 crc kubenswrapper[4865]: I1205 05:54:10.062968 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86ebba7b666f2b320d271da0279fb387983c3e54e1ceafff94c358a69c32bca4" Dec 05 05:54:10 crc kubenswrapper[4865]: I1205 05:54:10.063023 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 05:54:11 crc kubenswrapper[4865]: I1205 05:54:11.179508 4865 generic.go:334] "Generic (PLEG): container finished" podID="07656fb2-3659-4d14-8c59-72e8531ec4c0" containerID="1c7b5d14971d2e50a74a353eb1a31810c8cf0cb615d88f9d19f56c53f4f5a3df" exitCode=0 Dec 05 05:54:11 crc kubenswrapper[4865]: I1205 05:54:11.181002 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"07656fb2-3659-4d14-8c59-72e8531ec4c0","Type":"ContainerDied","Data":"1c7b5d14971d2e50a74a353eb1a31810c8cf0cb615d88f9d19f56c53f4f5a3df"} Dec 05 05:54:13 crc kubenswrapper[4865]: I1205 05:54:13.031358 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 05:54:13 crc kubenswrapper[4865]: I1205 05:54:13.130401 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07656fb2-3659-4d14-8c59-72e8531ec4c0-kube-api-access\") pod \"07656fb2-3659-4d14-8c59-72e8531ec4c0\" (UID: \"07656fb2-3659-4d14-8c59-72e8531ec4c0\") " Dec 05 05:54:13 crc kubenswrapper[4865]: I1205 05:54:13.130452 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07656fb2-3659-4d14-8c59-72e8531ec4c0-kubelet-dir\") pod \"07656fb2-3659-4d14-8c59-72e8531ec4c0\" (UID: \"07656fb2-3659-4d14-8c59-72e8531ec4c0\") " Dec 05 05:54:13 crc kubenswrapper[4865]: I1205 05:54:13.130947 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07656fb2-3659-4d14-8c59-72e8531ec4c0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "07656fb2-3659-4d14-8c59-72e8531ec4c0" (UID: "07656fb2-3659-4d14-8c59-72e8531ec4c0"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 05:54:13 crc kubenswrapper[4865]: I1205 05:54:13.145997 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07656fb2-3659-4d14-8c59-72e8531ec4c0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "07656fb2-3659-4d14-8c59-72e8531ec4c0" (UID: "07656fb2-3659-4d14-8c59-72e8531ec4c0"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:54:13 crc kubenswrapper[4865]: I1205 05:54:13.232363 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07656fb2-3659-4d14-8c59-72e8531ec4c0-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 05:54:13 crc kubenswrapper[4865]: I1205 05:54:13.232895 4865 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07656fb2-3659-4d14-8c59-72e8531ec4c0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 05:54:13 crc kubenswrapper[4865]: I1205 05:54:13.266761 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"07656fb2-3659-4d14-8c59-72e8531ec4c0","Type":"ContainerDied","Data":"bd95bb2fd4ac6c94108a1f519a6cb809304c3faf8c37a83b5d7144cb6ee0bfa3"} Dec 05 05:54:13 crc kubenswrapper[4865]: I1205 05:54:13.267535 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd95bb2fd4ac6c94108a1f519a6cb809304c3faf8c37a83b5d7144cb6ee0bfa3" Dec 05 05:54:13 crc kubenswrapper[4865]: I1205 05:54:13.267810 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 05:54:15 crc kubenswrapper[4865]: E1205 05:54:15.080337 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:54:15 crc kubenswrapper[4865]: I1205 05:54:15.116593 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:54:15 crc kubenswrapper[4865]: I1205 05:54:15.116645 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:54:15 crc kubenswrapper[4865]: I1205 05:54:15.116692 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-d6z25" Dec 05 05:54:15 crc kubenswrapper[4865]: I1205 05:54:15.117242 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"5562eeeba0cbd4ebedaf9f7244d1a73fd8aa40c2ef7e53078aa86fcbe426df83"} pod="openshift-console/downloads-7954f5f757-d6z25" containerMessage="Container download-server failed liveness probe, will be restarted" Dec 05 05:54:15 crc kubenswrapper[4865]: I1205 05:54:15.117313 4865 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" containerID="cri-o://5562eeeba0cbd4ebedaf9f7244d1a73fd8aa40c2ef7e53078aa86fcbe426df83" gracePeriod=2 Dec 05 05:54:15 crc kubenswrapper[4865]: I1205 05:54:15.118411 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:54:15 crc kubenswrapper[4865]: I1205 05:54:15.118442 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:54:15 crc kubenswrapper[4865]: I1205 05:54:15.118720 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:54:15 crc kubenswrapper[4865]: I1205 05:54:15.118741 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:54:15 crc kubenswrapper[4865]: E1205 05:54:15.183203 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:54:15 crc kubenswrapper[4865]: I1205 05:54:15.184242 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:54:15 crc kubenswrapper[4865]: E1205 05:54:15.197762 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:54:15 crc kubenswrapper[4865]: E1205 05:54:15.197844 4865 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" podUID="c3633127-3192-43e8-87a2-5049b2d82fa6" containerName="kube-multus-additional-cni-plugins" Dec 05 05:54:15 crc kubenswrapper[4865]: I1205 05:54:15.198154 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-lclsv" Dec 05 05:54:16 crc kubenswrapper[4865]: I1205 05:54:16.403753 4865 generic.go:334] "Generic (PLEG): container finished" podID="55de6799-e76b-4493-a007-49cd203e7573" containerID="5562eeeba0cbd4ebedaf9f7244d1a73fd8aa40c2ef7e53078aa86fcbe426df83" exitCode=0 Dec 05 05:54:16 crc kubenswrapper[4865]: I1205 
05:54:16.404007 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d6z25" event={"ID":"55de6799-e76b-4493-a007-49cd203e7573","Type":"ContainerDied","Data":"5562eeeba0cbd4ebedaf9f7244d1a73fd8aa40c2ef7e53078aa86fcbe426df83"} Dec 05 05:54:19 crc kubenswrapper[4865]: I1205 05:54:19.052546 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 05 05:54:19 crc kubenswrapper[4865]: I1205 05:54:19.466448 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d6z25" event={"ID":"55de6799-e76b-4493-a007-49cd203e7573","Type":"ContainerStarted","Data":"bf4058aa2cbc67e3124c7e3ab0032390a62e861be43c9dfd140ef0855829c3a2"} Dec 05 05:54:21 crc kubenswrapper[4865]: I1205 05:54:21.666746 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-d6z25" Dec 05 05:54:21 crc kubenswrapper[4865]: I1205 05:54:21.666855 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:54:21 crc kubenswrapper[4865]: I1205 05:54:21.666887 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:54:21 crc kubenswrapper[4865]: I1205 05:54:21.686436 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=2.686414308 podStartE2EDuration="2.686414308s" podCreationTimestamp="2025-12-05 05:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:54:21.141467092 +0000 UTC m=+80.421478314" watchObservedRunningTime="2025-12-05 05:54:21.686414308 +0000 UTC m=+80.966425530" Dec 05 05:54:22 crc kubenswrapper[4865]: I1205 05:54:22.268798 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:54:22 crc kubenswrapper[4865]: I1205 05:54:22.707729 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:54:22 crc kubenswrapper[4865]: I1205 05:54:22.708118 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:54:25 crc kubenswrapper[4865]: E1205 05:54:25.065840 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2 is running failed: container process not found" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" 
cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:54:25 crc kubenswrapper[4865]: E1205 05:54:25.066389 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2 is running failed: container process not found" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:54:25 crc kubenswrapper[4865]: E1205 05:54:25.066926 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2 is running failed: container process not found" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:54:25 crc kubenswrapper[4865]: E1205 05:54:25.066959 4865 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" podUID="c3633127-3192-43e8-87a2-5049b2d82fa6" containerName="kube-multus-additional-cni-plugins" Dec 05 05:54:25 crc kubenswrapper[4865]: I1205 05:54:25.119289 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:54:25 crc kubenswrapper[4865]: I1205 05:54:25.119304 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:54:25 crc kubenswrapper[4865]: I1205 05:54:25.119347 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:54:25 crc kubenswrapper[4865]: I1205 05:54:25.119365 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:54:25 crc kubenswrapper[4865]: I1205 05:54:25.338081 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whlbb" Dec 05 05:54:25 crc kubenswrapper[4865]: I1205 05:54:25.921244 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-k9x8b_c3633127-3192-43e8-87a2-5049b2d82fa6/kube-multus-additional-cni-plugins/0.log" Dec 05 05:54:25 crc kubenswrapper[4865]: I1205 05:54:25.921291 4865 generic.go:334] "Generic (PLEG): container finished" podID="c3633127-3192-43e8-87a2-5049b2d82fa6" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" 
exitCode=137 Dec 05 05:54:25 crc kubenswrapper[4865]: I1205 05:54:25.921324 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" event={"ID":"c3633127-3192-43e8-87a2-5049b2d82fa6","Type":"ContainerDied","Data":"f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2"} Dec 05 05:54:34 crc kubenswrapper[4865]: I1205 05:54:34.639281 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 05:54:34 crc kubenswrapper[4865]: E1205 05:54:34.640356 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60e1629-83b6-4492-bd6c-c0ed90da02be" containerName="collect-profiles" Dec 05 05:54:34 crc kubenswrapper[4865]: I1205 05:54:34.640369 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60e1629-83b6-4492-bd6c-c0ed90da02be" containerName="collect-profiles" Dec 05 05:54:34 crc kubenswrapper[4865]: E1205 05:54:34.640396 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc311200-23fa-499a-8c75-6450d4762f3d" containerName="pruner" Dec 05 05:54:34 crc kubenswrapper[4865]: I1205 05:54:34.640402 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc311200-23fa-499a-8c75-6450d4762f3d" containerName="pruner" Dec 05 05:54:34 crc kubenswrapper[4865]: E1205 05:54:34.640412 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07656fb2-3659-4d14-8c59-72e8531ec4c0" containerName="pruner" Dec 05 05:54:34 crc kubenswrapper[4865]: I1205 05:54:34.640420 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="07656fb2-3659-4d14-8c59-72e8531ec4c0" containerName="pruner" Dec 05 05:54:34 crc kubenswrapper[4865]: I1205 05:54:34.643164 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc311200-23fa-499a-8c75-6450d4762f3d" containerName="pruner" Dec 05 05:54:34 crc kubenswrapper[4865]: I1205 05:54:34.643201 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="d60e1629-83b6-4492-bd6c-c0ed90da02be" containerName="collect-profiles" Dec 05 05:54:34 crc kubenswrapper[4865]: I1205 05:54:34.643263 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="07656fb2-3659-4d14-8c59-72e8531ec4c0" containerName="pruner" Dec 05 05:54:34 crc kubenswrapper[4865]: I1205 05:54:34.644331 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 05:54:34 crc kubenswrapper[4865]: I1205 05:54:34.646860 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 05:54:34 crc kubenswrapper[4865]: I1205 05:54:34.651018 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 05:54:34 crc kubenswrapper[4865]: I1205 05:54:34.651623 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 05:54:34 crc kubenswrapper[4865]: I1205 05:54:34.820514 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8087a33-5a51-414f-9eea-6ffe70e9b9fe-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a8087a33-5a51-414f-9eea-6ffe70e9b9fe\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 05:54:34 crc kubenswrapper[4865]: I1205 05:54:34.820564 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a8087a33-5a51-414f-9eea-6ffe70e9b9fe-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a8087a33-5a51-414f-9eea-6ffe70e9b9fe\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 05:54:34 crc kubenswrapper[4865]: I1205 05:54:34.922094 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8087a33-5a51-414f-9eea-6ffe70e9b9fe-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a8087a33-5a51-414f-9eea-6ffe70e9b9fe\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 05:54:34 crc kubenswrapper[4865]: I1205 05:54:34.922192 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a8087a33-5a51-414f-9eea-6ffe70e9b9fe-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a8087a33-5a51-414f-9eea-6ffe70e9b9fe\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 05:54:34 crc kubenswrapper[4865]: I1205 05:54:34.922309 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a8087a33-5a51-414f-9eea-6ffe70e9b9fe-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a8087a33-5a51-414f-9eea-6ffe70e9b9fe\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 05:54:34 crc kubenswrapper[4865]: I1205 05:54:34.947242 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8087a33-5a51-414f-9eea-6ffe70e9b9fe-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a8087a33-5a51-414f-9eea-6ffe70e9b9fe\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 05:54:34 crc kubenswrapper[4865]: I1205 05:54:34.976215 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 05:54:35 crc kubenswrapper[4865]: I1205 05:54:35.054004 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 05 05:54:35 crc kubenswrapper[4865]: E1205 05:54:35.064958 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2 is running failed: container process not found" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:54:35 crc kubenswrapper[4865]: E1205 05:54:35.065790 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2 is running failed: container process not found" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:54:35 crc kubenswrapper[4865]: E1205 05:54:35.066291 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2 is running failed: container process not found" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:54:35 crc kubenswrapper[4865]: E1205 05:54:35.066363 4865 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" podUID="c3633127-3192-43e8-87a2-5049b2d82fa6" containerName="kube-multus-additional-cni-plugins" Dec 05 05:54:35 crc kubenswrapper[4865]: I1205 05:54:35.116432 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:54:35 crc kubenswrapper[4865]: I1205 05:54:35.116786 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:54:35 crc kubenswrapper[4865]: I1205 05:54:35.116558 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:54:35 crc kubenswrapper[4865]: I1205 05:54:35.116897 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:54:36 crc kubenswrapper[4865]: I1205 05:54:36.290685 4865 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 05:54:36 crc kubenswrapper[4865]: I1205 05:54:36.358956 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.358940402 podStartE2EDuration="1.358940402s" podCreationTimestamp="2025-12-05 05:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:54:36.338539354 +0000 UTC m=+95.618550576" watchObservedRunningTime="2025-12-05 05:54:36.358940402 +0000 UTC m=+95.638951624" Dec 05 05:54:39 crc kubenswrapper[4865]: I1205 05:54:39.425696 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 05:54:39 crc kubenswrapper[4865]: I1205 05:54:39.427033 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 05:54:39 crc kubenswrapper[4865]: I1205 05:54:39.447178 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 05:54:39 crc kubenswrapper[4865]: I1205 05:54:39.568718 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6-var-lock\") pod \"installer-9-crc\" (UID: \"0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 05:54:39 crc kubenswrapper[4865]: I1205 05:54:39.568777 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6-kube-api-access\") pod \"installer-9-crc\" (UID: \"0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 05:54:39 crc kubenswrapper[4865]: I1205 05:54:39.568802 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 05:54:39 crc kubenswrapper[4865]: I1205 05:54:39.670599 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6-var-lock\") pod \"installer-9-crc\" (UID: \"0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 05:54:39 crc kubenswrapper[4865]: I1205 05:54:39.670994 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6-kube-api-access\") pod \"installer-9-crc\" (UID: \"0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 05:54:39 crc kubenswrapper[4865]: I1205 05:54:39.671151 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 05:54:39 crc kubenswrapper[4865]: I1205 05:54:39.671261 4865 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 05:54:39 crc kubenswrapper[4865]: I1205 05:54:39.670749 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6-var-lock\") pod \"installer-9-crc\" (UID: \"0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 05:54:39 crc kubenswrapper[4865]: I1205 05:54:39.691288 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6-kube-api-access\") pod \"installer-9-crc\" (UID: \"0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 05:54:39 crc kubenswrapper[4865]: I1205 05:54:39.750572 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 05:54:45 crc kubenswrapper[4865]: E1205 05:54:45.065323 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2 is running failed: container process not found" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:54:45 crc kubenswrapper[4865]: E1205 05:54:45.066788 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2 is running failed: container process not found" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:54:45 crc kubenswrapper[4865]: E1205 05:54:45.067066 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2 is running failed: container process not found" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:54:45 crc kubenswrapper[4865]: E1205 05:54:45.067101 4865 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" podUID="c3633127-3192-43e8-87a2-5049b2d82fa6" containerName="kube-multus-additional-cni-plugins" Dec 05 05:54:45 crc kubenswrapper[4865]: I1205 05:54:45.117248 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:54:45 crc kubenswrapper[4865]: I1205 05:54:45.117578 4865 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:54:45 crc kubenswrapper[4865]: I1205 05:54:45.117284 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:54:45 crc kubenswrapper[4865]: I1205 05:54:45.117874 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:54:45 crc kubenswrapper[4865]: I1205 05:54:45.117962 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-d6z25" Dec 05 05:54:45 crc kubenswrapper[4865]: I1205 05:54:45.118676 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:54:45 crc kubenswrapper[4865]: I1205 05:54:45.118708 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"bf4058aa2cbc67e3124c7e3ab0032390a62e861be43c9dfd140ef0855829c3a2"} pod="openshift-console/downloads-7954f5f757-d6z25" containerMessage="Container download-server failed liveness probe, will be restarted" Dec 05 05:54:45 crc kubenswrapper[4865]: I1205 05:54:45.118740 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" containerID="cri-o://bf4058aa2cbc67e3124c7e3ab0032390a62e861be43c9dfd140ef0855829c3a2" gracePeriod=2 Dec 05 05:54:45 crc kubenswrapper[4865]: I1205 05:54:45.118735 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:54:46 crc kubenswrapper[4865]: I1205 05:54:46.166369 4865 generic.go:334] "Generic (PLEG): container finished" podID="55de6799-e76b-4493-a007-49cd203e7573" containerID="bf4058aa2cbc67e3124c7e3ab0032390a62e861be43c9dfd140ef0855829c3a2" exitCode=0 Dec 05 05:54:46 crc kubenswrapper[4865]: I1205 05:54:46.166423 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d6z25" event={"ID":"55de6799-e76b-4493-a007-49cd203e7573","Type":"ContainerDied","Data":"bf4058aa2cbc67e3124c7e3ab0032390a62e861be43c9dfd140ef0855829c3a2"} Dec 05 05:54:46 crc kubenswrapper[4865]: I1205 05:54:46.166483 4865 scope.go:117] "RemoveContainer" containerID="5562eeeba0cbd4ebedaf9f7244d1a73fd8aa40c2ef7e53078aa86fcbe426df83" Dec 05 05:54:55 crc kubenswrapper[4865]: E1205 05:54:55.065302 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or 
running: checking if PID of f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2 is running failed: container process not found" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:54:55 crc kubenswrapper[4865]: E1205 05:54:55.066376 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2 is running failed: container process not found" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:54:55 crc kubenswrapper[4865]: E1205 05:54:55.066610 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2 is running failed: container process not found" containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 05:54:55 crc kubenswrapper[4865]: E1205 05:54:55.066641 4865 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" podUID="c3633127-3192-43e8-87a2-5049b2d82fa6" containerName="kube-multus-additional-cni-plugins" Dec 05 05:54:55 crc kubenswrapper[4865]: I1205 05:54:55.118045 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:54:55 crc kubenswrapper[4865]: I1205 05:54:55.118109 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:55:01 crc kubenswrapper[4865]: I1205 05:55:01.143326 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-k9x8b_c3633127-3192-43e8-87a2-5049b2d82fa6/kube-multus-additional-cni-plugins/0.log" Dec 05 05:55:01 crc kubenswrapper[4865]: I1205 05:55:01.143978 4865 util.go:48] "No ready sandbox for pod can be found. 
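
Two probe mechanisms show up above: an exec readiness probe on kube-multus-additional-cni-plugins (the CRI ExecSync call running test -f /ready/ready, which returns NotFound once the container process is gone and is logged as "Probe errored"), and HTTP readiness/liveness probes on download-server (GET http://10.217.0.29:8080/, where "connection refused" is an ordinary probe failure and the failed liveness probe eventually triggers a restart). A sketch of both probe shapes using the core/v1 Go types; the timing fields are assumptions, since the log does not include them:

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	// Exec readiness probe as on kube-multus-additional-cni-plugins: the kubelet
	// asks the CRI runtime (ExecSync) to run this command inside the container.
	execReadiness := corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			Exec: &corev1.ExecAction{
				Command: []string{"/bin/bash", "-c", "test -f /ready/ready"},
			},
		},
		PeriodSeconds: 10, // assumption: the probe period is not in the log
	}

	// HTTP liveness/readiness probe as on download-server: GET on port 8080,
	// where "connection refused" counts as a failed attempt.
	httpLiveness := corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Path: "/",
				Port: intstr.FromInt(8080),
			},
		},
		PeriodSeconds:    10, // assumption
		FailureThreshold: 3,  // assumption
	}

	fmt.Println("exec probe command:", execReadiness.Exec.Command)
	fmt.Println("http probe target:", httpLiveness.HTTPGet.Path, httpLiveness.HTTPGet.Port.String())
}
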
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" Dec 05 05:55:01 crc kubenswrapper[4865]: I1205 05:55:01.184267 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3633127-3192-43e8-87a2-5049b2d82fa6-tuning-conf-dir\") pod \"c3633127-3192-43e8-87a2-5049b2d82fa6\" (UID: \"c3633127-3192-43e8-87a2-5049b2d82fa6\") " Dec 05 05:55:01 crc kubenswrapper[4865]: I1205 05:55:01.185009 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c3633127-3192-43e8-87a2-5049b2d82fa6-cni-sysctl-allowlist\") pod \"c3633127-3192-43e8-87a2-5049b2d82fa6\" (UID: \"c3633127-3192-43e8-87a2-5049b2d82fa6\") " Dec 05 05:55:01 crc kubenswrapper[4865]: I1205 05:55:01.184475 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3633127-3192-43e8-87a2-5049b2d82fa6-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "c3633127-3192-43e8-87a2-5049b2d82fa6" (UID: "c3633127-3192-43e8-87a2-5049b2d82fa6"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 05:55:01 crc kubenswrapper[4865]: I1205 05:55:01.185868 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h9z2\" (UniqueName: \"kubernetes.io/projected/c3633127-3192-43e8-87a2-5049b2d82fa6-kube-api-access-9h9z2\") pod \"c3633127-3192-43e8-87a2-5049b2d82fa6\" (UID: \"c3633127-3192-43e8-87a2-5049b2d82fa6\") " Dec 05 05:55:01 crc kubenswrapper[4865]: I1205 05:55:01.185947 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/c3633127-3192-43e8-87a2-5049b2d82fa6-ready\") pod \"c3633127-3192-43e8-87a2-5049b2d82fa6\" (UID: \"c3633127-3192-43e8-87a2-5049b2d82fa6\") " Dec 05 05:55:01 crc kubenswrapper[4865]: I1205 05:55:01.186066 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3633127-3192-43e8-87a2-5049b2d82fa6-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "c3633127-3192-43e8-87a2-5049b2d82fa6" (UID: "c3633127-3192-43e8-87a2-5049b2d82fa6"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:55:01 crc kubenswrapper[4865]: I1205 05:55:01.186459 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3633127-3192-43e8-87a2-5049b2d82fa6-ready" (OuterVolumeSpecName: "ready") pod "c3633127-3192-43e8-87a2-5049b2d82fa6" (UID: "c3633127-3192-43e8-87a2-5049b2d82fa6"). InnerVolumeSpecName "ready". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:55:01 crc kubenswrapper[4865]: I1205 05:55:01.186941 4865 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3633127-3192-43e8-87a2-5049b2d82fa6-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Dec 05 05:55:01 crc kubenswrapper[4865]: I1205 05:55:01.186966 4865 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c3633127-3192-43e8-87a2-5049b2d82fa6-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 05 05:55:01 crc kubenswrapper[4865]: I1205 05:55:01.186979 4865 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/c3633127-3192-43e8-87a2-5049b2d82fa6-ready\") on node \"crc\" DevicePath \"\"" Dec 05 05:55:01 crc kubenswrapper[4865]: I1205 05:55:01.195949 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3633127-3192-43e8-87a2-5049b2d82fa6-kube-api-access-9h9z2" (OuterVolumeSpecName: "kube-api-access-9h9z2") pod "c3633127-3192-43e8-87a2-5049b2d82fa6" (UID: "c3633127-3192-43e8-87a2-5049b2d82fa6"). InnerVolumeSpecName "kube-api-access-9h9z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:55:01 crc kubenswrapper[4865]: I1205 05:55:01.289024 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h9z2\" (UniqueName: \"kubernetes.io/projected/c3633127-3192-43e8-87a2-5049b2d82fa6-kube-api-access-9h9z2\") on node \"crc\" DevicePath \"\"" Dec 05 05:55:01 crc kubenswrapper[4865]: I1205 05:55:01.378099 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" event={"ID":"c3633127-3192-43e8-87a2-5049b2d82fa6","Type":"ContainerDied","Data":"6a243abfb8d82cba56428be118baf7a49110fece54a2e7aedf3d587344667b9e"} Dec 05 05:55:01 crc kubenswrapper[4865]: I1205 05:55:01.378348 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-k9x8b" Dec 05 05:55:01 crc kubenswrapper[4865]: I1205 05:55:01.420164 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-k9x8b"] Dec 05 05:55:01 crc kubenswrapper[4865]: I1205 05:55:01.425581 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-k9x8b"] Dec 05 05:55:03 crc kubenswrapper[4865]: I1205 05:55:03.013222 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3633127-3192-43e8-87a2-5049b2d82fa6" path="/var/lib/kubelet/pods/c3633127-3192-43e8-87a2-5049b2d82fa6/volumes" Dec 05 05:55:05 crc kubenswrapper[4865]: I1205 05:55:05.118493 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:55:05 crc kubenswrapper[4865]: I1205 05:55:05.119037 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:55:07 crc kubenswrapper[4865]: E1205 05:55:07.041211 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 05 05:55:07 crc kubenswrapper[4865]: E1205 05:55:07.041809 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-42qmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5zhc2_openshift-marketplace(a13b3fdc-c602-48f5-bc10-e2e30df8cc0e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context 
canceled" logger="UnhandledError" Dec 05 05:55:07 crc kubenswrapper[4865]: E1205 05:55:07.042947 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5zhc2" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" Dec 05 05:55:11 crc kubenswrapper[4865]: E1205 05:55:11.268485 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5zhc2" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" Dec 05 05:55:11 crc kubenswrapper[4865]: E1205 05:55:11.399746 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 05 05:55:11 crc kubenswrapper[4865]: E1205 05:55:11.400414 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-prrgt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-xljcc_openshift-marketplace(04a4a0fc-43e5-4409-a8e5-bfa4b2525322): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 05:55:11 crc kubenswrapper[4865]: E1205 05:55:11.401803 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-xljcc" podUID="04a4a0fc-43e5-4409-a8e5-bfa4b2525322" Dec 05 05:55:11 crc kubenswrapper[4865]: E1205 05:55:11.402859 4865 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 05 05:55:11 crc kubenswrapper[4865]: E1205 05:55:11.402983 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9t8j9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-fsjkk_openshift-marketplace(fc0b366c-dba6-4a98-8335-e5434858e367): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 05:55:11 crc kubenswrapper[4865]: E1205 05:55:11.404059 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-fsjkk" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" Dec 05 05:55:13 crc kubenswrapper[4865]: E1205 05:55:13.703203 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-fsjkk" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" Dec 05 05:55:13 crc kubenswrapper[4865]: E1205 05:55:13.703405 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-xljcc" podUID="04a4a0fc-43e5-4409-a8e5-bfa4b2525322" Dec 05 05:55:13 crc kubenswrapper[4865]: E1205 05:55:13.954781 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 05 05:55:13 crc kubenswrapper[4865]: E1205 05:55:13.954963 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-psql8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-ccqrr_openshift-marketplace(582e42c0-b2d0-4b24-900e-1316a155c471): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 05:55:13 crc kubenswrapper[4865]: E1205 05:55:13.956141 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ccqrr" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" Dec 05 05:55:14 crc kubenswrapper[4865]: E1205 05:55:14.393670 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 05 05:55:14 crc kubenswrapper[4865]: E1205 05:55:14.394055 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4j6c8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-rw9pr_openshift-marketplace(748082c2-70ae-4b67-9c21-ff6f32030822): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 05:55:14 crc kubenswrapper[4865]: E1205 05:55:14.395895 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-rw9pr" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" Dec 05 05:55:15 crc kubenswrapper[4865]: I1205 05:55:15.116309 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:55:15 crc kubenswrapper[4865]: I1205 05:55:15.116386 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:55:18 crc kubenswrapper[4865]: E1205 05:55:18.763571 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ccqrr" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" Dec 05 05:55:18 crc kubenswrapper[4865]: E1205 05:55:18.763958 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-rw9pr" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" Dec 05 05:55:18 crc kubenswrapper[4865]: I1205 05:55:18.948632 4865 scope.go:117] "RemoveContainer" 
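
The ErrImagePull / ImagePullBackOff pairs above follow the kubelet's usual pattern: the pull fails (here because the image copy was cancelled), and subsequent sync attempts are pushed back with a doubling delay. A plain-Go sketch of that doubling back-off; the 10-second base and 5-minute cap are the commonly documented kubelet defaults and are assumed here, not read from this node's configuration:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed defaults: 10s initial delay doubling up to a 5-minute cap.
	base := 10 * time.Second
	maxDelay := 5 * time.Minute

	delay := base
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("pull attempt %d failed: back off %v before retrying\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
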
containerID="f7870d592ef6e54b94046515041a72b9bd25dd7682e6cf9c94d21bc4892a4ba2" Dec 05 05:55:19 crc kubenswrapper[4865]: I1205 05:55:19.058090 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 05:55:19 crc kubenswrapper[4865]: E1205 05:55:19.078169 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 05 05:55:19 crc kubenswrapper[4865]: E1205 05:55:19.078375 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6294,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mds9l_openshift-marketplace(bbe8803a-815d-4318-bfaa-1949755ed910): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 05:55:19 crc kubenswrapper[4865]: E1205 05:55:19.079629 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mds9l" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" Dec 05 05:55:19 crc kubenswrapper[4865]: W1205 05:55:19.135224 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0c32bdbe_2acd_4833_b0ca_a9b9eecae8e6.slice/crio-40a4d4921b905019f27448704a9aa62bb5715a87ed62750e2c47042eafa476e0 WatchSource:0}: Error finding container 40a4d4921b905019f27448704a9aa62bb5715a87ed62750e2c47042eafa476e0: Status 404 returned error can't find the container with id 40a4d4921b905019f27448704a9aa62bb5715a87ed62750e2c47042eafa476e0 Dec 05 05:55:19 crc kubenswrapper[4865]: I1205 05:55:19.189553 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 
05:55:19 crc kubenswrapper[4865]: W1205 05:55:19.206101 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda8087a33_5a51_414f_9eea_6ffe70e9b9fe.slice/crio-6beed7e8651e6c05f3f29dd55c2c8c74c59191cfcae584e1081e15262ace1aaa WatchSource:0}: Error finding container 6beed7e8651e6c05f3f29dd55c2c8c74c59191cfcae584e1081e15262ace1aaa: Status 404 returned error can't find the container with id 6beed7e8651e6c05f3f29dd55c2c8c74c59191cfcae584e1081e15262ace1aaa Dec 05 05:55:19 crc kubenswrapper[4865]: I1205 05:55:19.486359 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6","Type":"ContainerStarted","Data":"40a4d4921b905019f27448704a9aa62bb5715a87ed62750e2c47042eafa476e0"} Dec 05 05:55:19 crc kubenswrapper[4865]: I1205 05:55:19.487563 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a8087a33-5a51-414f-9eea-6ffe70e9b9fe","Type":"ContainerStarted","Data":"6beed7e8651e6c05f3f29dd55c2c8c74c59191cfcae584e1081e15262ace1aaa"} Dec 05 05:55:19 crc kubenswrapper[4865]: E1205 05:55:19.488170 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mds9l" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" Dec 05 05:55:19 crc kubenswrapper[4865]: E1205 05:55:19.603837 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 05 05:55:19 crc kubenswrapper[4865]: E1205 05:55:19.604061 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bjbll,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-bdk79_openshift-marketplace(8ed66f14-1ac8-456b-b3bb-d909c0164767): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 05:55:19 crc kubenswrapper[4865]: E1205 05:55:19.605359 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-bdk79" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" Dec 05 05:55:19 crc kubenswrapper[4865]: E1205 05:55:19.727218 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 05 05:55:19 crc kubenswrapper[4865]: E1205 05:55:19.727384 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sbw8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-jjg7p_openshift-marketplace(f43580fe-7567-4fb7-b1fc-203bda11942a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 05:55:19 crc kubenswrapper[4865]: E1205 05:55:19.728689 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-jjg7p" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" Dec 05 05:55:20 crc kubenswrapper[4865]: I1205 05:55:20.858616 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d6z25" 
event={"ID":"55de6799-e76b-4493-a007-49cd203e7573","Type":"ContainerStarted","Data":"6a78cc6d11e8db91f4dfe59f956c821c9c2dc7d515ff046faa3bb6cdb3030aaa"} Dec 05 05:55:20 crc kubenswrapper[4865]: I1205 05:55:20.860421 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-d6z25" Dec 05 05:55:20 crc kubenswrapper[4865]: I1205 05:55:20.860497 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:55:20 crc kubenswrapper[4865]: I1205 05:55:20.860527 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:55:20 crc kubenswrapper[4865]: I1205 05:55:20.863598 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6","Type":"ContainerStarted","Data":"98d83c724f1935f758507627e71472fa6b78130ed27078daf6b813701d7dd398"} Dec 05 05:55:20 crc kubenswrapper[4865]: I1205 05:55:20.867655 4865 generic.go:334] "Generic (PLEG): container finished" podID="a8087a33-5a51-414f-9eea-6ffe70e9b9fe" containerID="7ffaebd285a3a85293cb8dcbb032752cb0dcd021820a69632600afaa2da55402" exitCode=0 Dec 05 05:55:20 crc kubenswrapper[4865]: I1205 05:55:20.867857 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a8087a33-5a51-414f-9eea-6ffe70e9b9fe","Type":"ContainerDied","Data":"7ffaebd285a3a85293cb8dcbb032752cb0dcd021820a69632600afaa2da55402"} Dec 05 05:55:20 crc kubenswrapper[4865]: E1205 05:55:20.870314 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-jjg7p" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" Dec 05 05:55:20 crc kubenswrapper[4865]: E1205 05:55:20.872204 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-bdk79" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" Dec 05 05:55:20 crc kubenswrapper[4865]: I1205 05:55:20.988997 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=41.988981905 podStartE2EDuration="41.988981905s" podCreationTimestamp="2025-12-05 05:54:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:55:20.962645531 +0000 UTC m=+140.242656753" watchObservedRunningTime="2025-12-05 05:55:20.988981905 +0000 UTC m=+140.268993127" Dec 05 05:55:21 crc kubenswrapper[4865]: I1205 05:55:21.880748 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 
10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:55:21 crc kubenswrapper[4865]: I1205 05:55:21.880860 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:55:22 crc kubenswrapper[4865]: I1205 05:55:22.113897 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 05:55:22 crc kubenswrapper[4865]: I1205 05:55:22.310296 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a8087a33-5a51-414f-9eea-6ffe70e9b9fe-kubelet-dir\") pod \"a8087a33-5a51-414f-9eea-6ffe70e9b9fe\" (UID: \"a8087a33-5a51-414f-9eea-6ffe70e9b9fe\") " Dec 05 05:55:22 crc kubenswrapper[4865]: I1205 05:55:22.310411 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8087a33-5a51-414f-9eea-6ffe70e9b9fe-kube-api-access\") pod \"a8087a33-5a51-414f-9eea-6ffe70e9b9fe\" (UID: \"a8087a33-5a51-414f-9eea-6ffe70e9b9fe\") " Dec 05 05:55:22 crc kubenswrapper[4865]: I1205 05:55:22.310727 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8087a33-5a51-414f-9eea-6ffe70e9b9fe-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a8087a33-5a51-414f-9eea-6ffe70e9b9fe" (UID: "a8087a33-5a51-414f-9eea-6ffe70e9b9fe"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 05:55:22 crc kubenswrapper[4865]: I1205 05:55:22.310844 4865 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a8087a33-5a51-414f-9eea-6ffe70e9b9fe-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 05:55:22 crc kubenswrapper[4865]: I1205 05:55:22.315616 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8087a33-5a51-414f-9eea-6ffe70e9b9fe-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a8087a33-5a51-414f-9eea-6ffe70e9b9fe" (UID: "a8087a33-5a51-414f-9eea-6ffe70e9b9fe"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:55:22 crc kubenswrapper[4865]: I1205 05:55:22.412623 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8087a33-5a51-414f-9eea-6ffe70e9b9fe-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 05:55:22 crc kubenswrapper[4865]: I1205 05:55:22.885474 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a8087a33-5a51-414f-9eea-6ffe70e9b9fe","Type":"ContainerDied","Data":"6beed7e8651e6c05f3f29dd55c2c8c74c59191cfcae584e1081e15262ace1aaa"} Dec 05 05:55:22 crc kubenswrapper[4865]: I1205 05:55:22.885575 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6beed7e8651e6c05f3f29dd55c2c8c74c59191cfcae584e1081e15262ace1aaa" Dec 05 05:55:22 crc kubenswrapper[4865]: I1205 05:55:22.885492 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 05:55:22 crc kubenswrapper[4865]: I1205 05:55:22.886252 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:55:22 crc kubenswrapper[4865]: I1205 05:55:22.886327 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:55:25 crc kubenswrapper[4865]: I1205 05:55:25.116443 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:55:25 crc kubenswrapper[4865]: I1205 05:55:25.116921 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:55:25 crc kubenswrapper[4865]: I1205 05:55:25.116477 4865 patch_prober.go:28] interesting pod/downloads-7954f5f757-d6z25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 05 05:55:25 crc kubenswrapper[4865]: I1205 05:55:25.116999 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d6z25" podUID="55de6799-e76b-4493-a007-49cd203e7573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 05 05:55:35 crc kubenswrapper[4865]: I1205 05:55:35.132100 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-d6z25" Dec 05 05:55:41 crc kubenswrapper[4865]: I1205 05:55:41.048801 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 05:55:41 crc kubenswrapper[4865]: I1205 05:55:41.049431 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 05:55:43 crc kubenswrapper[4865]: I1205 05:55:43.318615 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-82xnx"] Dec 05 05:55:55 crc kubenswrapper[4865]: E1205 05:55:55.034259 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 05 05:55:55 crc kubenswrapper[4865]: E1205 05:55:55.034986 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bjbll,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-bdk79_openshift-marketplace(8ed66f14-1ac8-456b-b3bb-d909c0164767): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 05:55:55 crc kubenswrapper[4865]: E1205 05:55:55.036652 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-bdk79" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" Dec 05 05:55:56 crc kubenswrapper[4865]: I1205 05:55:56.062660 4865 generic.go:334] "Generic (PLEG): container finished" podID="fc0b366c-dba6-4a98-8335-e5434858e367" containerID="6d16f849561442a73dc55c4f9b41f4cd0c34477750439c735b90057a99219344" exitCode=0 Dec 05 05:55:56 crc kubenswrapper[4865]: I1205 05:55:56.062883 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsjkk" event={"ID":"fc0b366c-dba6-4a98-8335-e5434858e367","Type":"ContainerDied","Data":"6d16f849561442a73dc55c4f9b41f4cd0c34477750439c735b90057a99219344"} Dec 05 05:55:56 crc kubenswrapper[4865]: I1205 05:55:56.072244 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjg7p" event={"ID":"f43580fe-7567-4fb7-b1fc-203bda11942a","Type":"ContainerStarted","Data":"e72dfa1bf633d205c717c4bf710c9ee04dabe452a7ecdca7037a2e683c708f97"} Dec 05 05:55:56 crc kubenswrapper[4865]: I1205 05:55:56.075169 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw9pr" 
event={"ID":"748082c2-70ae-4b67-9c21-ff6f32030822","Type":"ContainerStarted","Data":"c73a936c6387a1dbf28742e3d992fc433dc08a0b072cde9772f3b2c1edef410c"} Dec 05 05:55:56 crc kubenswrapper[4865]: I1205 05:55:56.078188 4865 generic.go:334] "Generic (PLEG): container finished" podID="bbe8803a-815d-4318-bfaa-1949755ed910" containerID="9c9577a8383f9cb872fc992fed8dedd51dc286a2bbdf84e67edc8f1ee2ab5e27" exitCode=0 Dec 05 05:55:56 crc kubenswrapper[4865]: I1205 05:55:56.078265 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mds9l" event={"ID":"bbe8803a-815d-4318-bfaa-1949755ed910","Type":"ContainerDied","Data":"9c9577a8383f9cb872fc992fed8dedd51dc286a2bbdf84e67edc8f1ee2ab5e27"} Dec 05 05:55:56 crc kubenswrapper[4865]: I1205 05:55:56.081289 4865 generic.go:334] "Generic (PLEG): container finished" podID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" containerID="6efad480a2917cf54bb33cc53a81fca29bfc02b19668c477c0f95e27251a0be9" exitCode=0 Dec 05 05:55:56 crc kubenswrapper[4865]: I1205 05:55:56.081386 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zhc2" event={"ID":"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e","Type":"ContainerDied","Data":"6efad480a2917cf54bb33cc53a81fca29bfc02b19668c477c0f95e27251a0be9"} Dec 05 05:55:56 crc kubenswrapper[4865]: I1205 05:55:56.085447 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xljcc" event={"ID":"04a4a0fc-43e5-4409-a8e5-bfa4b2525322","Type":"ContainerStarted","Data":"f5cf0d4b49e34fc58e891af15f631790d507b6ac2f15c09a3c2df944ee623757"} Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.098887 4865 generic.go:334] "Generic (PLEG): container finished" podID="f43580fe-7567-4fb7-b1fc-203bda11942a" containerID="e72dfa1bf633d205c717c4bf710c9ee04dabe452a7ecdca7037a2e683c708f97" exitCode=0 Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.099029 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjg7p" event={"ID":"f43580fe-7567-4fb7-b1fc-203bda11942a","Type":"ContainerDied","Data":"e72dfa1bf633d205c717c4bf710c9ee04dabe452a7ecdca7037a2e683c708f97"} Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.102810 4865 generic.go:334] "Generic (PLEG): container finished" podID="04a4a0fc-43e5-4409-a8e5-bfa4b2525322" containerID="f5cf0d4b49e34fc58e891af15f631790d507b6ac2f15c09a3c2df944ee623757" exitCode=0 Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.102887 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xljcc" event={"ID":"04a4a0fc-43e5-4409-a8e5-bfa4b2525322","Type":"ContainerDied","Data":"f5cf0d4b49e34fc58e891af15f631790d507b6ac2f15c09a3c2df944ee623757"} Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.105926 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccqrr" event={"ID":"582e42c0-b2d0-4b24-900e-1316a155c471","Type":"ContainerStarted","Data":"1eca415c9add728d8c06d6db223e2020eb15cc5dce769a80666b4389c04354ec"} Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.130773 4865 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.131158 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
containerID="cri-o://644d5d1cdf80e1e990ac2681ed71aa80a5197b2886d66f5bc92ce197c3c57077" gracePeriod=15 Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.131331 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://fc6942270139af3934b85e9a4f963915463acb7bceddc1bfa00c229935877867" gracePeriod=15 Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.131399 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://0ff59482544992667891e11e736e2ff3c3c093fdc68505dbddca74bd065bbc75" gracePeriod=15 Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.131471 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://81b5ebabadb97cd677f48cadea47e96aeddeb43f21218d18a8fc35d2f80694cb" gracePeriod=15 Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.131516 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://67dcec4a9a0ff2f2778e848f4d11a689497c63530d2a950e74ef11f7405aeff1" gracePeriod=15 Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.131884 4865 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 05:55:57 crc kubenswrapper[4865]: E1205 05:55:57.132158 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.132185 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 05:55:57 crc kubenswrapper[4865]: E1205 05:55:57.132203 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.132213 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 05:55:57 crc kubenswrapper[4865]: E1205 05:55:57.132227 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3633127-3192-43e8-87a2-5049b2d82fa6" containerName="kube-multus-additional-cni-plugins" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.132236 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3633127-3192-43e8-87a2-5049b2d82fa6" containerName="kube-multus-additional-cni-plugins" Dec 05 05:55:57 crc kubenswrapper[4865]: E1205 05:55:57.132252 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.132260 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 05:55:57 crc kubenswrapper[4865]: 
E1205 05:55:57.132270 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.132277 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 05 05:55:57 crc kubenswrapper[4865]: E1205 05:55:57.132288 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.132313 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 05:55:57 crc kubenswrapper[4865]: E1205 05:55:57.132323 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.132330 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 05:55:57 crc kubenswrapper[4865]: E1205 05:55:57.132342 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8087a33-5a51-414f-9eea-6ffe70e9b9fe" containerName="pruner" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.132349 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8087a33-5a51-414f-9eea-6ffe70e9b9fe" containerName="pruner" Dec 05 05:55:57 crc kubenswrapper[4865]: E1205 05:55:57.132361 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.132370 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.132495 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.132511 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.132521 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8087a33-5a51-414f-9eea-6ffe70e9b9fe" containerName="pruner" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.132530 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.132540 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.132550 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3633127-3192-43e8-87a2-5049b2d82fa6" containerName="kube-multus-additional-cni-plugins" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.132563 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.132574 4865 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.176339 4865 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.177464 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.183536 4865 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 05 05:55:57 crc kubenswrapper[4865]: E1205 05:55:57.235679 4865 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.147:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.295434 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.295506 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.295534 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.295559 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.295611 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.295628 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:55:57 crc 
kubenswrapper[4865]: I1205 05:55:57.295647 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.295666 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.397244 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.397301 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.397320 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.397338 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.397388 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.397404 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.397423 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:55:57 crc 
kubenswrapper[4865]: I1205 05:55:57.397444 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.397510 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.397549 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.397571 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.397623 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.397644 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.397667 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.397686 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.397705 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.536748 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:55:57 crc kubenswrapper[4865]: W1205 05:55:57.554261 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-3ce185013a99b49c3b9562c4391e9294ee1fcc1883fd081512381699d51d6585 WatchSource:0}: Error finding container 3ce185013a99b49c3b9562c4391e9294ee1fcc1883fd081512381699d51d6585: Status 404 returned error can't find the container with id 3ce185013a99b49c3b9562c4391e9294ee1fcc1883fd081512381699d51d6585 Dec 05 05:55:57 crc kubenswrapper[4865]: E1205 05:55:57.556440 4865 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.147:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e3c134c2f8c3c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 05:55:57.556042812 +0000 UTC m=+176.836054034,LastTimestamp:2025-12-05 05:55:57.556042812 +0000 UTC m=+176.836054034,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 05:55:57 crc kubenswrapper[4865]: E1205 05:55:57.721450 4865 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:57 crc kubenswrapper[4865]: E1205 05:55:57.722279 4865 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:57 crc kubenswrapper[4865]: E1205 05:55:57.723043 4865 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:57 crc kubenswrapper[4865]: E1205 05:55:57.723314 4865 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:57 crc kubenswrapper[4865]: E1205 05:55:57.723560 4865 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:57 crc kubenswrapper[4865]: I1205 05:55:57.723592 4865 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 05 05:55:57 crc kubenswrapper[4865]: 
E1205 05:55:57.724066 4865 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" interval="200ms" Dec 05 05:55:57 crc kubenswrapper[4865]: E1205 05:55:57.925501 4865 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" interval="400ms" Dec 05 05:55:58 crc kubenswrapper[4865]: I1205 05:55:58.113625 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsjkk" event={"ID":"fc0b366c-dba6-4a98-8335-e5434858e367","Type":"ContainerStarted","Data":"7fa1a585ae25b8f9f781f30b371806c330383d617a00d95ae34dac617dac0583"} Dec 05 05:55:58 crc kubenswrapper[4865]: I1205 05:55:58.115917 4865 generic.go:334] "Generic (PLEG): container finished" podID="582e42c0-b2d0-4b24-900e-1316a155c471" containerID="1eca415c9add728d8c06d6db223e2020eb15cc5dce769a80666b4389c04354ec" exitCode=0 Dec 05 05:55:58 crc kubenswrapper[4865]: I1205 05:55:58.116015 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccqrr" event={"ID":"582e42c0-b2d0-4b24-900e-1316a155c471","Type":"ContainerDied","Data":"1eca415c9add728d8c06d6db223e2020eb15cc5dce769a80666b4389c04354ec"} Dec 05 05:55:58 crc kubenswrapper[4865]: I1205 05:55:58.116784 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:58 crc kubenswrapper[4865]: I1205 05:55:58.118482 4865 generic.go:334] "Generic (PLEG): container finished" podID="748082c2-70ae-4b67-9c21-ff6f32030822" containerID="c73a936c6387a1dbf28742e3d992fc433dc08a0b072cde9772f3b2c1edef410c" exitCode=0 Dec 05 05:55:58 crc kubenswrapper[4865]: I1205 05:55:58.118562 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw9pr" event={"ID":"748082c2-70ae-4b67-9c21-ff6f32030822","Type":"ContainerDied","Data":"c73a936c6387a1dbf28742e3d992fc433dc08a0b072cde9772f3b2c1edef410c"} Dec 05 05:55:58 crc kubenswrapper[4865]: I1205 05:55:58.119644 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:58 crc kubenswrapper[4865]: I1205 05:55:58.119947 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:58 crc kubenswrapper[4865]: I1205 05:55:58.120333 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"3ce185013a99b49c3b9562c4391e9294ee1fcc1883fd081512381699d51d6585"} Dec 05 05:55:58 crc kubenswrapper[4865]: E1205 05:55:58.326946 4865 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" interval="800ms" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.126699 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b8138f65307669cdc4ff02f4694ffd6820f91b6bb64b80c44de096bbc6f5fa2d"} Dec 05 05:55:59 crc kubenswrapper[4865]: E1205 05:55:59.127447 4865 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" interval="1.6s" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.127448 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: E1205 05:55:59.127517 4865 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.147:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.127646 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.128600 4865 generic.go:334] "Generic (PLEG): container finished" podID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" containerID="98d83c724f1935f758507627e71472fa6b78130ed27078daf6b813701d7dd398" exitCode=0 Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.128658 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6","Type":"ContainerDied","Data":"98d83c724f1935f758507627e71472fa6b78130ed27078daf6b813701d7dd398"} Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.129412 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.129758 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.129987 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.131975 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjg7p" event={"ID":"f43580fe-7567-4fb7-b1fc-203bda11942a","Type":"ContainerStarted","Data":"94759b43bbee15d0b8ce41e22074feb3cd9c47791e46bb06a34ebf4495aa179a"} Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.132394 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.132675 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.133009 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.133244 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.134132 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mds9l" event={"ID":"bbe8803a-815d-4318-bfaa-1949755ed910","Type":"ContainerStarted","Data":"56a9f4d3f3183605a2d0a0891af2adc8d377769b40ea5cd4f0145972aaaf632b"} Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.134599 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.134929 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection 
refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.135194 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.135425 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.135606 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.136870 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zhc2" event={"ID":"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e","Type":"ContainerStarted","Data":"4b86da1bf649e6e44e0757a6877f1535c13c51f539c162e0f607433cec81cd48"} Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.137370 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.137901 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.138252 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.139035 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.139659 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.139870 4865 
status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.140098 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.141552 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.142746 4865 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fc6942270139af3934b85e9a4f963915463acb7bceddc1bfa00c229935877867" exitCode=0 Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.142876 4865 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0ff59482544992667891e11e736e2ff3c3c093fdc68505dbddca74bd065bbc75" exitCode=0 Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.142960 4865 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="81b5ebabadb97cd677f48cadea47e96aeddeb43f21218d18a8fc35d2f80694cb" exitCode=0 Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.143041 4865 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="67dcec4a9a0ff2f2778e848f4d11a689497c63530d2a950e74ef11f7405aeff1" exitCode=2 Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.142857 4865 scope.go:117] "RemoveContainer" containerID="6cd4f8940401d4b905b9766810dea6fd3d7543e9d0b204d61a8de150415d2644" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.144072 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.144331 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.144591 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.144836 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: 
connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.145049 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.145309 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:55:59 crc kubenswrapper[4865]: I1205 05:55:59.145646 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:00 crc kubenswrapper[4865]: E1205 05:56:00.148348 4865 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.147:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:56:00 crc kubenswrapper[4865]: I1205 05:56:00.514759 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 05:56:00 crc kubenswrapper[4865]: I1205 05:56:00.515292 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:00 crc kubenswrapper[4865]: I1205 05:56:00.515695 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:00 crc kubenswrapper[4865]: I1205 05:56:00.516057 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:00 crc kubenswrapper[4865]: I1205 05:56:00.516324 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:00 crc kubenswrapper[4865]: I1205 05:56:00.516521 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:00 crc kubenswrapper[4865]: I1205 05:56:00.516717 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:00 crc kubenswrapper[4865]: I1205 05:56:00.516946 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:00 crc kubenswrapper[4865]: I1205 05:56:00.641965 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6-kubelet-dir\") pod \"0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6\" (UID: \"0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6\") " Dec 05 05:56:00 crc kubenswrapper[4865]: I1205 05:56:00.642022 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6-kube-api-access\") pod \"0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6\" (UID: \"0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6\") " Dec 05 05:56:00 crc kubenswrapper[4865]: I1205 05:56:00.642056 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6-var-lock\") pod \"0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6\" (UID: \"0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6\") " Dec 05 05:56:00 crc kubenswrapper[4865]: I1205 05:56:00.642481 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6-var-lock" (OuterVolumeSpecName: "var-lock") pod "0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" (UID: "0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 05:56:00 crc kubenswrapper[4865]: I1205 05:56:00.642513 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" (UID: "0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 05:56:00 crc kubenswrapper[4865]: I1205 05:56:00.650051 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" (UID: "0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:56:00 crc kubenswrapper[4865]: E1205 05:56:00.728712 4865 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" interval="3.2s" Dec 05 05:56:00 crc kubenswrapper[4865]: I1205 05:56:00.743632 4865 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 05:56:00 crc kubenswrapper[4865]: I1205 05:56:00.743660 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 05:56:00 crc kubenswrapper[4865]: I1205 05:56:00.743672 4865 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6-var-lock\") on node \"crc\" DevicePath \"\"" Dec 05 05:56:01 crc kubenswrapper[4865]: I1205 05:56:01.008398 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:01 crc kubenswrapper[4865]: I1205 05:56:01.008645 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:01 crc kubenswrapper[4865]: I1205 05:56:01.008837 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:01 crc kubenswrapper[4865]: I1205 05:56:01.009044 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:01 crc kubenswrapper[4865]: I1205 05:56:01.009216 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:01 crc kubenswrapper[4865]: I1205 05:56:01.009377 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:01 
crc kubenswrapper[4865]: I1205 05:56:01.009547 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:01 crc kubenswrapper[4865]: I1205 05:56:01.155139 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 05:56:01 crc kubenswrapper[4865]: I1205 05:56:01.155744 4865 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="644d5d1cdf80e1e990ac2681ed71aa80a5197b2886d66f5bc92ce197c3c57077" exitCode=0 Dec 05 05:56:01 crc kubenswrapper[4865]: I1205 05:56:01.157275 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6","Type":"ContainerDied","Data":"40a4d4921b905019f27448704a9aa62bb5715a87ed62750e2c47042eafa476e0"} Dec 05 05:56:01 crc kubenswrapper[4865]: I1205 05:56:01.157311 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40a4d4921b905019f27448704a9aa62bb5715a87ed62750e2c47042eafa476e0" Dec 05 05:56:01 crc kubenswrapper[4865]: I1205 05:56:01.157340 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 05:56:01 crc kubenswrapper[4865]: I1205 05:56:01.161103 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:01 crc kubenswrapper[4865]: I1205 05:56:01.161503 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:01 crc kubenswrapper[4865]: I1205 05:56:01.161916 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:01 crc kubenswrapper[4865]: I1205 05:56:01.162278 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:01 crc kubenswrapper[4865]: I1205 05:56:01.162539 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: 
connect: connection refused" Dec 05 05:56:01 crc kubenswrapper[4865]: I1205 05:56:01.162766 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:01 crc kubenswrapper[4865]: I1205 05:56:01.162996 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.434465 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.436111 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.437333 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.438145 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.439970 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.440450 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.441401 4865 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.442143 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": 
dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.442476 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.442881 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: E1205 05:56:02.480265 4865 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T05:56:02Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T05:56:02Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T05:56:02Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T05:56:02Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:15adb3b2133604b064893f8009a74145e4c8bb5b134d111346dcccbdd2aa9bc2\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:164fc35a19aa6cc886c8015c8ee3eba4895e76b1152cb9d795e4f3154a8533a3\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1610512706},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a929531bb959f0b8fee26224ee1c20db089abfeca0140403ae1f0c3363ef71d1\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:f8716572be76ae0a4e79f51c5a917183459b6b2ceacbd574fe24b5a9c15805b1\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1208070485},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a476
42465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\
\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: E1205 05:56:02.480889 4865 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: E1205 05:56:02.481169 4865 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: E1205 05:56:02.481488 4865 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: E1205 05:56:02.481883 4865 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: E1205 05:56:02.481914 4865 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.513549 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5zhc2" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.513614 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5zhc2" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.524658 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fsjkk" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.524854 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fsjkk" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.569757 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.569878 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.569989 4865 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.570186 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.570200 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.570239 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.590153 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5zhc2" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.591140 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fsjkk" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.591132 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.591665 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.591873 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.592081 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.592394 4865 
status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.592728 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.592939 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.593307 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.593747 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.593958 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.594259 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.594530 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.594720 4865 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.595089 4865 
status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.595459 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.595803 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.671570 4865 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.671610 4865 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 05:56:02 crc kubenswrapper[4865]: I1205 05:56:02.671620 4865 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.012624 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.171744 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.172571 4865 scope.go:117] "RemoveContainer" containerID="fc6942270139af3934b85e9a4f963915463acb7bceddc1bfa00c229935877867" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.172597 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.173922 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.174442 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.174673 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.175072 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.175422 4865 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.175753 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.176128 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.176400 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.180576 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.180807 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.181053 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.181210 4865 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.181383 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.181594 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.181754 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.182080 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.218304 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fsjkk" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.218786 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 
05:56:03.219123 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.219329 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.219486 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.219635 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.219804 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.219990 4865 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.220145 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.593021 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jjg7p" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.593070 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jjg7p" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.633105 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jjg7p" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.633476 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.633732 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.634002 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.634180 4865 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.634340 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.634488 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.634641 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.634800 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.685991 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mds9l" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.686027 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mds9l" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.727948 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mds9l" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.728416 4865 
status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.728745 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.729023 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.729343 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.729623 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.729870 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.730126 4865 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.730423 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.846096 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5zhc2" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.846591 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.848982 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.849401 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.849762 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.850237 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.850510 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.850810 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: I1205 05:56:03.851048 4865 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:03 crc kubenswrapper[4865]: E1205 05:56:03.930400 4865 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" interval="6.4s" Dec 05 05:56:04 crc kubenswrapper[4865]: I1205 05:56:04.224445 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mds9l" Dec 05 05:56:04 crc kubenswrapper[4865]: I1205 05:56:04.225044 4865 status_manager.go:851] "Failed to get status for 
pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:04 crc kubenswrapper[4865]: I1205 05:56:04.225239 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:04 crc kubenswrapper[4865]: I1205 05:56:04.225393 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:04 crc kubenswrapper[4865]: I1205 05:56:04.225557 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:04 crc kubenswrapper[4865]: I1205 05:56:04.225716 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:04 crc kubenswrapper[4865]: I1205 05:56:04.225920 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:04 crc kubenswrapper[4865]: I1205 05:56:04.226010 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jjg7p" Dec 05 05:56:04 crc kubenswrapper[4865]: I1205 05:56:04.226070 4865 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:04 crc kubenswrapper[4865]: I1205 05:56:04.226299 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:04 crc kubenswrapper[4865]: I1205 05:56:04.226818 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": 
dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:04 crc kubenswrapper[4865]: I1205 05:56:04.227187 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:04 crc kubenswrapper[4865]: I1205 05:56:04.227659 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:04 crc kubenswrapper[4865]: I1205 05:56:04.228035 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:04 crc kubenswrapper[4865]: I1205 05:56:04.228336 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:04 crc kubenswrapper[4865]: I1205 05:56:04.228689 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:04 crc kubenswrapper[4865]: I1205 05:56:04.229023 4865 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:04 crc kubenswrapper[4865]: I1205 05:56:04.229226 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:05 crc kubenswrapper[4865]: E1205 05:56:05.015590 4865 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.147:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e3c134c2f8c3c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 05:55:57.556042812 +0000 UTC m=+176.836054034,LastTimestamp:2025-12-05 05:55:57.556042812 +0000 UTC m=+176.836054034,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 05:56:07 crc kubenswrapper[4865]: I1205 05:56:07.006945 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:07 crc kubenswrapper[4865]: I1205 05:56:07.007654 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:07 crc kubenswrapper[4865]: I1205 05:56:07.008077 4865 status_manager.go:851] "Failed to get status for pod" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" pod="openshift-marketplace/community-operators-bdk79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-bdk79\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:07 crc kubenswrapper[4865]: I1205 05:56:07.008375 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:07 crc kubenswrapper[4865]: I1205 05:56:07.008602 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:07 crc kubenswrapper[4865]: I1205 05:56:07.008854 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:07 crc kubenswrapper[4865]: I1205 05:56:07.009044 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:07 crc kubenswrapper[4865]: I1205 05:56:07.009256 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:07 crc kubenswrapper[4865]: I1205 05:56:07.707418 4865 scope.go:117] "RemoveContainer" containerID="0ff59482544992667891e11e736e2ff3c3c093fdc68505dbddca74bd065bbc75" Dec 05 05:56:08 crc kubenswrapper[4865]: I1205 05:56:08.202896 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 05:56:08 crc kubenswrapper[4865]: I1205 05:56:08.344693 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" podUID="87a82cae-057c-47d5-9703-eb48128e1bd9" containerName="oauth-openshift" containerID="cri-o://e83abea23c54966ab2a465c991a97efda4b707d5d861cbf48efeff3081a32f03" gracePeriod=15 Dec 05 05:56:09 crc kubenswrapper[4865]: I1205 05:56:09.006050 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:56:09 crc kubenswrapper[4865]: I1205 05:56:09.006940 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:09 crc kubenswrapper[4865]: I1205 05:56:09.007295 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:09 crc kubenswrapper[4865]: I1205 05:56:09.007496 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:09 crc kubenswrapper[4865]: I1205 05:56:09.007644 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:09 crc kubenswrapper[4865]: I1205 05:56:09.007796 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:09 crc kubenswrapper[4865]: I1205 05:56:09.007970 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:09 crc 
kubenswrapper[4865]: I1205 05:56:09.008121 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:09 crc kubenswrapper[4865]: I1205 05:56:09.008267 4865 status_manager.go:851] "Failed to get status for pod" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" pod="openshift-marketplace/community-operators-bdk79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-bdk79\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:09 crc kubenswrapper[4865]: I1205 05:56:09.022576 4865 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ed818037-beb4-4918-a648-c51549a1b8dc" Dec 05 05:56:09 crc kubenswrapper[4865]: I1205 05:56:09.022607 4865 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ed818037-beb4-4918-a648-c51549a1b8dc" Dec 05 05:56:09 crc kubenswrapper[4865]: E1205 05:56:09.022953 4865 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:56:09 crc kubenswrapper[4865]: I1205 05:56:09.023438 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:56:09 crc kubenswrapper[4865]: I1205 05:56:09.025490 4865 scope.go:117] "RemoveContainer" containerID="81b5ebabadb97cd677f48cadea47e96aeddeb43f21218d18a8fc35d2f80694cb" Dec 05 05:56:09 crc kubenswrapper[4865]: E1205 05:56:09.043150 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-bdk79" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" Dec 05 05:56:09 crc kubenswrapper[4865]: I1205 05:56:09.210355 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 05:56:10 crc kubenswrapper[4865]: I1205 05:56:10.228409 4865 generic.go:334] "Generic (PLEG): container finished" podID="87a82cae-057c-47d5-9703-eb48128e1bd9" containerID="e83abea23c54966ab2a465c991a97efda4b707d5d861cbf48efeff3081a32f03" exitCode=0 Dec 05 05:56:10 crc kubenswrapper[4865]: I1205 05:56:10.228480 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" event={"ID":"87a82cae-057c-47d5-9703-eb48128e1bd9","Type":"ContainerDied","Data":"e83abea23c54966ab2a465c991a97efda4b707d5d861cbf48efeff3081a32f03"} Dec 05 05:56:10 crc kubenswrapper[4865]: E1205 05:56:10.331565 4865 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" interval="7s" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.012433 4865 status_manager.go:851] "Failed to get 
status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.013183 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.013404 4865 status_manager.go:851] "Failed to get status for pod" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" pod="openshift-marketplace/community-operators-bdk79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-bdk79\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.013772 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.014265 4865 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.014587 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.014800 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.015067 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.015338 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.048764 4865 patch_prober.go:28] interesting 
pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.048843 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.225929 4865 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.225981 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.239979 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.240035 4865 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c4decfc2a1886b97bc70deec1e0391c2ec3f1963ccf9bd83d9893d6cf1459a5d" exitCode=1 Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.240068 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c4decfc2a1886b97bc70deec1e0391c2ec3f1963ccf9bd83d9893d6cf1459a5d"} Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.240564 4865 scope.go:117] "RemoveContainer" containerID="c4decfc2a1886b97bc70deec1e0391c2ec3f1963ccf9bd83d9893d6cf1459a5d" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.241186 4865 status_manager.go:851] "Failed to get status for pod" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" pod="openshift-marketplace/community-operators-bdk79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-bdk79\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.242173 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.242490 4865 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.242735 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.243083 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.243353 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.243691 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.244040 4865 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.244391 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.244685 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.574231 4865 scope.go:117] "RemoveContainer" containerID="67dcec4a9a0ff2f2778e848f4d11a689497c63530d2a950e74ef11f7405aeff1" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.644374 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.645176 4865 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.645463 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.645657 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.645840 4865 status_manager.go:851] "Failed to get status for pod" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" pod="openshift-marketplace/community-operators-bdk79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-bdk79\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.646022 4865 status_manager.go:851] "Failed to get status for pod" podUID="87a82cae-057c-47d5-9703-eb48128e1bd9" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-82xnx\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.646178 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.646318 4865 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.646615 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.647296 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.647543 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.647777 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.650612 4865 scope.go:117] "RemoveContainer" containerID="644d5d1cdf80e1e990ac2681ed71aa80a5197b2886d66f5bc92ce197c3c57077" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.684535 4865 scope.go:117] "RemoveContainer" containerID="7b41e449fb8368c47471e8c81ee727a6c8346cd627097d6e50d17234505b1298" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.701392 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-template-error\") pod \"87a82cae-057c-47d5-9703-eb48128e1bd9\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.701686 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-service-ca\") pod \"87a82cae-057c-47d5-9703-eb48128e1bd9\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.701716 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szwj6\" (UniqueName: \"kubernetes.io/projected/87a82cae-057c-47d5-9703-eb48128e1bd9-kube-api-access-szwj6\") pod \"87a82cae-057c-47d5-9703-eb48128e1bd9\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.701748 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-ocp-branding-template\") pod \"87a82cae-057c-47d5-9703-eb48128e1bd9\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.701776 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-session\") pod \"87a82cae-057c-47d5-9703-eb48128e1bd9\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.701868 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-trusted-ca-bundle\") pod 
\"87a82cae-057c-47d5-9703-eb48128e1bd9\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.701899 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-router-certs\") pod \"87a82cae-057c-47d5-9703-eb48128e1bd9\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.701934 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/87a82cae-057c-47d5-9703-eb48128e1bd9-audit-dir\") pod \"87a82cae-057c-47d5-9703-eb48128e1bd9\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.701965 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-idp-0-file-data\") pod \"87a82cae-057c-47d5-9703-eb48128e1bd9\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.701988 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-template-login\") pod \"87a82cae-057c-47d5-9703-eb48128e1bd9\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.702060 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-serving-cert\") pod \"87a82cae-057c-47d5-9703-eb48128e1bd9\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.702102 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-cliconfig\") pod \"87a82cae-057c-47d5-9703-eb48128e1bd9\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.702123 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-template-provider-selection\") pod \"87a82cae-057c-47d5-9703-eb48128e1bd9\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.702187 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-audit-policies\") pod \"87a82cae-057c-47d5-9703-eb48128e1bd9\" (UID: \"87a82cae-057c-47d5-9703-eb48128e1bd9\") " Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.703089 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "87a82cae-057c-47d5-9703-eb48128e1bd9" (UID: "87a82cae-057c-47d5-9703-eb48128e1bd9"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.703109 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "87a82cae-057c-47d5-9703-eb48128e1bd9" (UID: "87a82cae-057c-47d5-9703-eb48128e1bd9"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.703179 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87a82cae-057c-47d5-9703-eb48128e1bd9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "87a82cae-057c-47d5-9703-eb48128e1bd9" (UID: "87a82cae-057c-47d5-9703-eb48128e1bd9"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.704161 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "87a82cae-057c-47d5-9703-eb48128e1bd9" (UID: "87a82cae-057c-47d5-9703-eb48128e1bd9"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.704533 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "87a82cae-057c-47d5-9703-eb48128e1bd9" (UID: "87a82cae-057c-47d5-9703-eb48128e1bd9"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.712153 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "87a82cae-057c-47d5-9703-eb48128e1bd9" (UID: "87a82cae-057c-47d5-9703-eb48128e1bd9"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.712551 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "87a82cae-057c-47d5-9703-eb48128e1bd9" (UID: "87a82cae-057c-47d5-9703-eb48128e1bd9"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.712833 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "87a82cae-057c-47d5-9703-eb48128e1bd9" (UID: "87a82cae-057c-47d5-9703-eb48128e1bd9"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.714489 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "87a82cae-057c-47d5-9703-eb48128e1bd9" (UID: "87a82cae-057c-47d5-9703-eb48128e1bd9"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.714790 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "87a82cae-057c-47d5-9703-eb48128e1bd9" (UID: "87a82cae-057c-47d5-9703-eb48128e1bd9"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.715070 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87a82cae-057c-47d5-9703-eb48128e1bd9-kube-api-access-szwj6" (OuterVolumeSpecName: "kube-api-access-szwj6") pod "87a82cae-057c-47d5-9703-eb48128e1bd9" (UID: "87a82cae-057c-47d5-9703-eb48128e1bd9"). InnerVolumeSpecName "kube-api-access-szwj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.715491 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "87a82cae-057c-47d5-9703-eb48128e1bd9" (UID: "87a82cae-057c-47d5-9703-eb48128e1bd9"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.715847 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "87a82cae-057c-47d5-9703-eb48128e1bd9" (UID: "87a82cae-057c-47d5-9703-eb48128e1bd9"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.715938 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "87a82cae-057c-47d5-9703-eb48128e1bd9" (UID: "87a82cae-057c-47d5-9703-eb48128e1bd9"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.804337 4865 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/87a82cae-057c-47d5-9703-eb48128e1bd9-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.804374 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.804385 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.804396 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.804407 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.804416 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.804426 4865 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.804434 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.804444 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.804453 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szwj6\" (UniqueName: \"kubernetes.io/projected/87a82cae-057c-47d5-9703-eb48128e1bd9-kube-api-access-szwj6\") on node \"crc\" DevicePath \"\"" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.804463 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.804472 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-session\") on 
node \"crc\" DevicePath \"\"" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.804481 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 05:56:11 crc kubenswrapper[4865]: I1205 05:56:11.804489 4865 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/87a82cae-057c-47d5-9703-eb48128e1bd9-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.247344 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" event={"ID":"87a82cae-057c-47d5-9703-eb48128e1bd9","Type":"ContainerDied","Data":"ae12a8c637b786de5c1a85a635492bef0131458baa06c17017a905ebce3f328f"} Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.247398 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.247412 4865 scope.go:117] "RemoveContainer" containerID="e83abea23c54966ab2a465c991a97efda4b707d5d861cbf48efeff3081a32f03" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.248259 4865 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.248492 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1c4f2da8163d75343a39edae26b2c71566d817129d29ff298a959a469c71af0f"} Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.248630 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.248987 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.249263 4865 status_manager.go:851] "Failed to get status for pod" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" pod="openshift-marketplace/community-operators-bdk79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-bdk79\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.249614 4865 status_manager.go:851] "Failed to get status for pod" podUID="87a82cae-057c-47d5-9703-eb48128e1bd9" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-82xnx\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.249997 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.250235 4865 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.250574 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.250965 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.251284 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.251625 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.273309 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.273962 4865 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.274197 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.274399 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.274603 4865 status_manager.go:851] "Failed to get status for pod" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" pod="openshift-marketplace/community-operators-bdk79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-bdk79\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.274814 4865 status_manager.go:851] "Failed to get status for pod" podUID="87a82cae-057c-47d5-9703-eb48128e1bd9" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-82xnx\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.275157 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.275536 4865 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.275881 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.276202 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: I1205 05:56:12.276539 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: E1205 05:56:12.532864 4865 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch 
status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T05:56:12Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T05:56:12Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T05:56:12Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T05:56:12Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:15adb3b2133604b064893f8009a74145e4c8bb5b134d111346dcccbdd2aa9bc2\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:164fc35a19aa6cc886c8015c8ee3eba4895e76b1152cb9d795e4f3154a8533a3\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1610512706},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a929531bb959f0b8fee26224ee1c20db089abfeca0140403ae1f0c3363ef71d1\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:f8716572be76ae0a4e79f51c5a917183459b6b2ceacbd574fe24b5a9c15805b1\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1208070485},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488
ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.147:6443: 
connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: E1205 05:56:12.533604 4865 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: E1205 05:56:12.533863 4865 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: E1205 05:56:12.534079 4865 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: E1205 05:56:12.534282 4865 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:12 crc kubenswrapper[4865]: E1205 05:56:12.534300 4865 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 05:56:13 crc kubenswrapper[4865]: I1205 05:56:13.258455 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 05:56:13 crc kubenswrapper[4865]: I1205 05:56:13.258749 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"374aabfce78bfe0ffbc1417ac3b56335f117e39c8a581823fa115d74e622cde8"} Dec 05 05:56:13 crc kubenswrapper[4865]: I1205 05:56:13.759897 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:56:14 crc kubenswrapper[4865]: I1205 05:56:14.266966 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7c5f937aae09e9b9f95d3a8b28468cdd1c2d1656af9a45442fcadd9d1ac7377d"} Dec 05 05:56:14 crc kubenswrapper[4865]: I1205 05:56:14.270002 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xljcc" event={"ID":"04a4a0fc-43e5-4409-a8e5-bfa4b2525322","Type":"ContainerStarted","Data":"70d23a6d79d5b5a207c36677970567804dd0168a20b9e0f576f951f830d7eaae"} Dec 05 05:56:14 crc kubenswrapper[4865]: I1205 05:56:14.270962 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:14 crc kubenswrapper[4865]: I1205 05:56:14.271293 4865 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:14 crc kubenswrapper[4865]: I1205 05:56:14.271468 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:14 crc kubenswrapper[4865]: I1205 05:56:14.271612 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:14 crc kubenswrapper[4865]: I1205 05:56:14.271775 4865 status_manager.go:851] "Failed to get status for pod" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" pod="openshift-marketplace/community-operators-bdk79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-bdk79\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:14 crc kubenswrapper[4865]: I1205 05:56:14.271970 4865 status_manager.go:851] "Failed to get status for pod" podUID="87a82cae-057c-47d5-9703-eb48128e1bd9" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-82xnx\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:14 crc kubenswrapper[4865]: I1205 05:56:14.272174 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:14 crc kubenswrapper[4865]: I1205 05:56:14.272340 4865 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:14 crc kubenswrapper[4865]: I1205 05:56:14.273197 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:14 crc kubenswrapper[4865]: I1205 05:56:14.273514 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:14 crc kubenswrapper[4865]: I1205 05:56:14.273948 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" 
pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:15 crc kubenswrapper[4865]: E1205 05:56:15.018256 4865 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.147:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e3c134c2f8c3c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 05:55:57.556042812 +0000 UTC m=+176.836054034,LastTimestamp:2025-12-05 05:55:57.556042812 +0000 UTC m=+176.836054034,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 05:56:15 crc kubenswrapper[4865]: I1205 05:56:15.878609 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:56:15 crc kubenswrapper[4865]: I1205 05:56:15.878763 4865 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 05:56:15 crc kubenswrapper[4865]: I1205 05:56:15.879272 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.282305 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccqrr" event={"ID":"582e42c0-b2d0-4b24-900e-1316a155c471","Type":"ContainerStarted","Data":"c672a0092c43b18fe726b5d154341dc3fd84c70d15a16eac437c9c01f7bce4ab"} Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.283413 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.283673 4865 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.147:6443: connect: connection 
refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.284037 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.284250 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.284458 4865 status_manager.go:851] "Failed to get status for pod" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" pod="openshift-marketplace/community-operators-bdk79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-bdk79\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.284677 4865 status_manager.go:851] "Failed to get status for pod" podUID="87a82cae-057c-47d5-9703-eb48128e1bd9" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-82xnx\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.284741 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw9pr" event={"ID":"748082c2-70ae-4b67-9c21-ff6f32030822","Type":"ContainerStarted","Data":"b1a62f068f425148ad02449b42b2dd525b10167f7e56712506bd2861dc6c1179"} Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.284904 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.285115 4865 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.285272 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.285444 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 
05:56:16.285667 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.285962 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.286156 4865 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.286365 4865 status_manager.go:851] "Failed to get status for pod" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.286576 4865 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="7c5f937aae09e9b9f95d3a8b28468cdd1c2d1656af9a45442fcadd9d1ac7377d" exitCode=0 Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.286670 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.286853 4865 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ed818037-beb4-4918-a648-c51549a1b8dc" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.286871 4865 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ed818037-beb4-4918-a648-c51549a1b8dc" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.287015 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"7c5f937aae09e9b9f95d3a8b28468cdd1c2d1656af9a45442fcadd9d1ac7377d"} Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.287092 4865 status_manager.go:851] "Failed to get status for pod" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" pod="openshift-marketplace/community-operators-bdk79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-bdk79\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: E1205 05:56:16.287126 4865 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.287282 4865 status_manager.go:851] "Failed to get status for pod" podUID="87a82cae-057c-47d5-9703-eb48128e1bd9" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-82xnx\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.287492 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.287652 4865 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.287904 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.288315 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.288612 4865 status_manager.go:851] "Failed to get status for pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.289147 4865 status_manager.go:851] "Failed to get status for pod" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" pod="openshift-marketplace/certified-operators-5zhc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5zhc2\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.289410 4865 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.289750 4865 status_manager.go:851] "Failed to get status for pod" 
podUID="748082c2-70ae-4b67-9c21-ff6f32030822" pod="openshift-marketplace/redhat-operators-rw9pr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rw9pr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.290100 4865 status_manager.go:851] "Failed to get status for pod" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.290365 4865 status_manager.go:851] "Failed to get status for pod" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" pod="openshift-marketplace/community-operators-bdk79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-bdk79\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.290801 4865 status_manager.go:851] "Failed to get status for pod" podUID="87a82cae-057c-47d5-9703-eb48128e1bd9" pod="openshift-authentication/oauth-openshift-558db77b4-82xnx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-82xnx\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.291166 4865 status_manager.go:851] "Failed to get status for pod" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" pod="openshift-marketplace/redhat-marketplace-mds9l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mds9l\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.291460 4865 status_manager.go:851] "Failed to get status for pod" podUID="04a4a0fc-43e5-4409-a8e5-bfa4b2525322" pod="openshift-marketplace/redhat-operators-xljcc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xljcc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.291848 4865 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.292200 4865 status_manager.go:851] "Failed to get status for pod" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" pod="openshift-marketplace/certified-operators-fsjkk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fsjkk\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.292539 4865 status_manager.go:851] "Failed to get status for pod" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" pod="openshift-marketplace/community-operators-ccqrr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ccqrr\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:16 crc kubenswrapper[4865]: I1205 05:56:16.293051 4865 status_manager.go:851] "Failed to get status for 
pod" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" pod="openshift-marketplace/redhat-marketplace-jjg7p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-jjg7p\": dial tcp 38.102.83.147:6443: connect: connection refused" Dec 05 05:56:17 crc kubenswrapper[4865]: I1205 05:56:17.297799 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"941fe67e4cc2823e376c972d34dd2f05b9f6db6e7f4b699fd5325c8b7492879d"} Dec 05 05:56:17 crc kubenswrapper[4865]: I1205 05:56:17.298290 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1b48c070c5d2abb44cb3d7988589df9412fd4add5c874dc0bfe0f5a461b5ef44"} Dec 05 05:56:17 crc kubenswrapper[4865]: I1205 05:56:17.298301 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"32ecc200df15379309ef5aa9d6810a92c6d4758287099cff83fc856b68a9d056"} Dec 05 05:56:17 crc kubenswrapper[4865]: I1205 05:56:17.298310 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"31b58154cae5cb8a4e6e1df1373b57226ea0e5276f23ea811f6c251e7179dd97"} Dec 05 05:56:18 crc kubenswrapper[4865]: I1205 05:56:18.307521 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ecc58ca7e6c83d6a5f835a519117765441541dce5c6b44a5a8fd5373b8911c26"} Dec 05 05:56:18 crc kubenswrapper[4865]: I1205 05:56:18.307681 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:56:18 crc kubenswrapper[4865]: I1205 05:56:18.307814 4865 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ed818037-beb4-4918-a648-c51549a1b8dc" Dec 05 05:56:18 crc kubenswrapper[4865]: I1205 05:56:18.307868 4865 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ed818037-beb4-4918-a648-c51549a1b8dc" Dec 05 05:56:19 crc kubenswrapper[4865]: I1205 05:56:19.486552 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:56:19 crc kubenswrapper[4865]: I1205 05:56:19.486854 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:56:19 crc kubenswrapper[4865]: I1205 05:56:19.510396 4865 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]log ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]etcd ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 05 05:56:19 crc kubenswrapper[4865]: 
[+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/priority-and-fairness-filter ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/start-apiextensions-informers ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/start-apiextensions-controllers ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/crd-informer-synced ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/start-system-namespaces-controller ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 05 05:56:19 crc kubenswrapper[4865]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 05 05:56:19 crc kubenswrapper[4865]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/bootstrap-controller ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/start-kube-aggregator-informers ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/apiservice-registration-controller ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/apiservice-discovery-controller ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]autoregister-completion ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/apiservice-openapi-controller ok Dec 05 05:56:19 crc kubenswrapper[4865]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 05 05:56:19 crc kubenswrapper[4865]: livez check failed Dec 05 05:56:19 crc kubenswrapper[4865]: I1205 05:56:19.510465 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 05:56:21 crc kubenswrapper[4865]: I1205 05:56:21.224906 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 
05:56:22 crc kubenswrapper[4865]: I1205 05:56:22.232043 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ccqrr" Dec 05 05:56:22 crc kubenswrapper[4865]: I1205 05:56:22.233566 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ccqrr" Dec 05 05:56:22 crc kubenswrapper[4865]: I1205 05:56:22.279645 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ccqrr" Dec 05 05:56:22 crc kubenswrapper[4865]: I1205 05:56:22.531595 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ccqrr" Dec 05 05:56:23 crc kubenswrapper[4865]: I1205 05:56:23.488497 4865 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:56:23 crc kubenswrapper[4865]: I1205 05:56:23.689149 4865 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="407e00f5-c4cc-4b65-993d-85b401f50d97" Dec 05 05:56:24 crc kubenswrapper[4865]: I1205 05:56:24.504464 4865 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ed818037-beb4-4918-a648-c51549a1b8dc" Dec 05 05:56:24 crc kubenswrapper[4865]: I1205 05:56:24.504521 4865 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ed818037-beb4-4918-a648-c51549a1b8dc" Dec 05 05:56:24 crc kubenswrapper[4865]: I1205 05:56:24.511493 4865 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="407e00f5-c4cc-4b65-993d-85b401f50d97" Dec 05 05:56:24 crc kubenswrapper[4865]: I1205 05:56:24.995358 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xljcc" Dec 05 05:56:24 crc kubenswrapper[4865]: I1205 05:56:24.995410 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xljcc" Dec 05 05:56:25 crc kubenswrapper[4865]: I1205 05:56:25.033709 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xljcc" Dec 05 05:56:25 crc kubenswrapper[4865]: I1205 05:56:25.509114 4865 generic.go:334] "Generic (PLEG): container finished" podID="8ed66f14-1ac8-456b-b3bb-d909c0164767" containerID="c963e1428a698cf16325285b9b17d4c3d8ab4797a22e9e9c9dc2fdd02caa6c77" exitCode=0 Dec 05 05:56:25 crc kubenswrapper[4865]: I1205 05:56:25.509783 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdk79" event={"ID":"8ed66f14-1ac8-456b-b3bb-d909c0164767","Type":"ContainerDied","Data":"c963e1428a698cf16325285b9b17d4c3d8ab4797a22e9e9c9dc2fdd02caa6c77"} Dec 05 05:56:25 crc kubenswrapper[4865]: I1205 05:56:25.556280 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xljcc" Dec 05 05:56:25 crc kubenswrapper[4865]: I1205 05:56:25.595325 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rw9pr" Dec 05 05:56:25 crc kubenswrapper[4865]: I1205 05:56:25.595424 4865 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rw9pr" Dec 05 05:56:25 crc kubenswrapper[4865]: I1205 05:56:25.664340 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rw9pr" Dec 05 05:56:25 crc kubenswrapper[4865]: I1205 05:56:25.878780 4865 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 05:56:25 crc kubenswrapper[4865]: I1205 05:56:25.878844 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 05:56:26 crc kubenswrapper[4865]: I1205 05:56:26.516951 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdk79" event={"ID":"8ed66f14-1ac8-456b-b3bb-d909c0164767","Type":"ContainerStarted","Data":"977be8a03e1dbfc0f779f1f6e113f33e7655f330bb6e84aaca5e05932fea6df6"} Dec 05 05:56:26 crc kubenswrapper[4865]: I1205 05:56:26.564204 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rw9pr" Dec 05 05:56:31 crc kubenswrapper[4865]: I1205 05:56:31.790592 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bdk79" Dec 05 05:56:31 crc kubenswrapper[4865]: I1205 05:56:31.791846 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bdk79" Dec 05 05:56:31 crc kubenswrapper[4865]: I1205 05:56:31.835907 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bdk79" Dec 05 05:56:32 crc kubenswrapper[4865]: I1205 05:56:32.596691 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bdk79" Dec 05 05:56:33 crc kubenswrapper[4865]: I1205 05:56:33.025428 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 05:56:33 crc kubenswrapper[4865]: I1205 05:56:33.623455 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 05:56:33 crc kubenswrapper[4865]: I1205 05:56:33.637129 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 05:56:34 crc kubenswrapper[4865]: I1205 05:56:34.152702 4865 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 05:56:34 crc kubenswrapper[4865]: I1205 05:56:34.474369 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 05:56:34 crc kubenswrapper[4865]: I1205 05:56:34.793763 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 05:56:34 crc kubenswrapper[4865]: I1205 05:56:34.849230 4865 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 05:56:34 crc kubenswrapper[4865]: I1205 05:56:34.989263 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 05:56:35 crc kubenswrapper[4865]: I1205 05:56:35.573426 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 05:56:35 crc kubenswrapper[4865]: I1205 05:56:35.879108 4865 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 05:56:35 crc kubenswrapper[4865]: I1205 05:56:35.879168 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 05:56:35 crc kubenswrapper[4865]: I1205 05:56:35.879229 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:56:35 crc kubenswrapper[4865]: I1205 05:56:35.879978 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"374aabfce78bfe0ffbc1417ac3b56335f117e39c8a581823fa115d74e622cde8"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 05 05:56:35 crc kubenswrapper[4865]: I1205 05:56:35.880160 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://374aabfce78bfe0ffbc1417ac3b56335f117e39c8a581823fa115d74e622cde8" gracePeriod=30 Dec 05 05:56:36 crc kubenswrapper[4865]: I1205 05:56:36.134951 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 05:56:36 crc kubenswrapper[4865]: I1205 05:56:36.283859 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 05:56:36 crc kubenswrapper[4865]: I1205 05:56:36.307234 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 05:56:36 crc kubenswrapper[4865]: I1205 05:56:36.389806 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 05:56:36 crc kubenswrapper[4865]: I1205 05:56:36.451994 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 05:56:36 crc kubenswrapper[4865]: I1205 05:56:36.517103 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 05:56:36 crc kubenswrapper[4865]: I1205 05:56:36.630137 4865 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 05:56:37 crc kubenswrapper[4865]: I1205 05:56:37.121746 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 05:56:37 crc kubenswrapper[4865]: I1205 05:56:37.376296 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 05:56:37 crc kubenswrapper[4865]: I1205 05:56:37.421558 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 05 05:56:37 crc kubenswrapper[4865]: I1205 05:56:37.451537 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 05:56:37 crc kubenswrapper[4865]: I1205 05:56:37.469339 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 05:56:37 crc kubenswrapper[4865]: I1205 05:56:37.631651 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 05:56:37 crc kubenswrapper[4865]: I1205 05:56:37.652383 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 05:56:37 crc kubenswrapper[4865]: I1205 05:56:37.660342 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 05:56:37 crc kubenswrapper[4865]: I1205 05:56:37.761735 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 05 05:56:37 crc kubenswrapper[4865]: I1205 05:56:37.767436 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 05:56:37 crc kubenswrapper[4865]: I1205 05:56:37.775723 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 05:56:37 crc kubenswrapper[4865]: I1205 05:56:37.824258 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 05 05:56:37 crc kubenswrapper[4865]: I1205 05:56:37.876777 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 05:56:38 crc kubenswrapper[4865]: I1205 05:56:38.012754 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 05:56:38 crc kubenswrapper[4865]: I1205 05:56:38.106276 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 05 05:56:38 crc kubenswrapper[4865]: I1205 05:56:38.128620 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 05:56:38 crc kubenswrapper[4865]: I1205 05:56:38.192360 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 05:56:38 crc kubenswrapper[4865]: I1205 05:56:38.403320 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 05:56:38 crc kubenswrapper[4865]: I1205 05:56:38.487644 4865 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 05 05:56:38 crc kubenswrapper[4865]: I1205 05:56:38.699225 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 05 05:56:38 crc kubenswrapper[4865]: I1205 05:56:38.813892 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 05:56:38 crc kubenswrapper[4865]: I1205 05:56:38.862048 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 05:56:38 crc kubenswrapper[4865]: I1205 05:56:38.940081 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 05:56:39 crc kubenswrapper[4865]: I1205 05:56:39.404134 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 05:56:39 crc kubenswrapper[4865]: I1205 05:56:39.466030 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 05:56:39 crc kubenswrapper[4865]: I1205 05:56:39.659078 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 05 05:56:39 crc kubenswrapper[4865]: I1205 05:56:39.713866 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 05 05:56:39 crc kubenswrapper[4865]: I1205 05:56:39.726366 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 05:56:39 crc kubenswrapper[4865]: I1205 05:56:39.799914 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 05:56:39 crc kubenswrapper[4865]: I1205 05:56:39.934434 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 05 05:56:40 crc kubenswrapper[4865]: I1205 05:56:40.006127 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 05:56:40 crc kubenswrapper[4865]: I1205 05:56:40.034304 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 05:56:40 crc kubenswrapper[4865]: I1205 05:56:40.071188 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 05 05:56:40 crc kubenswrapper[4865]: I1205 05:56:40.172252 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 05:56:40 crc kubenswrapper[4865]: I1205 05:56:40.222697 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 05:56:40 crc kubenswrapper[4865]: I1205 05:56:40.272034 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 05:56:40 crc kubenswrapper[4865]: I1205 05:56:40.310736 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 05:56:40 crc kubenswrapper[4865]: I1205 05:56:40.507780 4865 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 05:56:40 crc kubenswrapper[4865]: I1205 05:56:40.594498 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 05:56:40 crc kubenswrapper[4865]: I1205 05:56:40.599813 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 05:56:40 crc kubenswrapper[4865]: I1205 05:56:40.708249 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 05 05:56:40 crc kubenswrapper[4865]: I1205 05:56:40.865592 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 05:56:40 crc kubenswrapper[4865]: I1205 05:56:40.971688 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 05:56:40 crc kubenswrapper[4865]: I1205 05:56:40.978977 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.012321 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.020851 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.048678 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.048763 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.048884 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.050208 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1da390b15af25b9223a372681201798d719c48662ab76913d773a35198260faf"} pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.050305 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" containerID="cri-o://1da390b15af25b9223a372681201798d719c48662ab76913d773a35198260faf" gracePeriod=600 Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.114723 4865 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.207601 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.368493 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.536041 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.610346 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="1da390b15af25b9223a372681201798d719c48662ab76913d773a35198260faf" exitCode=0 Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.610395 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"1da390b15af25b9223a372681201798d719c48662ab76913d773a35198260faf"} Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.610427 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"c37cd466671a814dc7fd213e210192f3341c2133e9a2d5a7ced242665a144318"} Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.611777 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.632463 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.752186 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.837346 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.840743 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.886006 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 05:56:41 crc kubenswrapper[4865]: I1205 05:56:41.891334 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 05:56:42 crc kubenswrapper[4865]: I1205 05:56:42.035157 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 05:56:42 crc kubenswrapper[4865]: I1205 05:56:42.096581 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 05:56:42 crc kubenswrapper[4865]: I1205 05:56:42.223789 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 05:56:42 crc kubenswrapper[4865]: I1205 05:56:42.355604 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 05:56:42 
crc kubenswrapper[4865]: I1205 05:56:42.395395 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 05:56:42 crc kubenswrapper[4865]: I1205 05:56:42.697573 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 05:56:42 crc kubenswrapper[4865]: I1205 05:56:42.810174 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 05:56:42 crc kubenswrapper[4865]: I1205 05:56:42.829150 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 05:56:42 crc kubenswrapper[4865]: I1205 05:56:42.886559 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 05:56:42 crc kubenswrapper[4865]: I1205 05:56:42.951586 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 05:56:43 crc kubenswrapper[4865]: I1205 05:56:43.170608 4865 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 05:56:43 crc kubenswrapper[4865]: I1205 05:56:43.382369 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 05:56:43 crc kubenswrapper[4865]: I1205 05:56:43.479339 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 05:56:43 crc kubenswrapper[4865]: I1205 05:56:43.743458 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 05:56:43 crc kubenswrapper[4865]: I1205 05:56:43.761024 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 05:56:44 crc kubenswrapper[4865]: I1205 05:56:44.015484 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 05:56:44 crc kubenswrapper[4865]: I1205 05:56:44.065987 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 05 05:56:44 crc kubenswrapper[4865]: I1205 05:56:44.121392 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 05:56:44 crc kubenswrapper[4865]: I1205 05:56:44.140341 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 05:56:44 crc kubenswrapper[4865]: I1205 05:56:44.269863 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 05:56:44 crc kubenswrapper[4865]: I1205 05:56:44.311008 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 05 05:56:44 crc kubenswrapper[4865]: I1205 05:56:44.833443 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 05:56:44 crc kubenswrapper[4865]: I1205 05:56:44.834298 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 05 05:56:44 crc kubenswrapper[4865]: I1205 05:56:44.941172 4865 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 05:56:45 crc kubenswrapper[4865]: I1205 05:56:45.110485 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 05:56:45 crc kubenswrapper[4865]: I1205 05:56:45.156444 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 05:56:45 crc kubenswrapper[4865]: I1205 05:56:45.163168 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 05 05:56:45 crc kubenswrapper[4865]: I1205 05:56:45.204982 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 05:56:45 crc kubenswrapper[4865]: I1205 05:56:45.352211 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 05:56:45 crc kubenswrapper[4865]: I1205 05:56:45.397442 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 05 05:56:45 crc kubenswrapper[4865]: I1205 05:56:45.503025 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 05 05:56:45 crc kubenswrapper[4865]: I1205 05:56:45.537635 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 05:56:45 crc kubenswrapper[4865]: I1205 05:56:45.641412 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 05 05:56:45 crc kubenswrapper[4865]: I1205 05:56:45.873933 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 05:56:46 crc kubenswrapper[4865]: I1205 05:56:46.188274 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 05:56:46 crc kubenswrapper[4865]: I1205 05:56:46.199819 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 05:56:46 crc kubenswrapper[4865]: I1205 05:56:46.276498 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 05:56:46 crc kubenswrapper[4865]: I1205 05:56:46.418267 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 05:56:46 crc kubenswrapper[4865]: I1205 05:56:46.638958 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 05:56:46 crc kubenswrapper[4865]: I1205 05:56:46.719658 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 05 05:56:46 crc kubenswrapper[4865]: I1205 05:56:46.784973 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 05:56:46 crc kubenswrapper[4865]: I1205 05:56:46.928634 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 05:56:47 crc kubenswrapper[4865]: 
I1205 05:56:47.088511 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 05 05:56:47 crc kubenswrapper[4865]: I1205 05:56:47.269750 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 05 05:56:47 crc kubenswrapper[4865]: I1205 05:56:47.464863 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 05:56:47 crc kubenswrapper[4865]: I1205 05:56:47.803256 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 05 05:56:47 crc kubenswrapper[4865]: I1205 05:56:47.849201 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 05:56:48 crc kubenswrapper[4865]: I1205 05:56:48.146065 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 05:56:48 crc kubenswrapper[4865]: I1205 05:56:48.227098 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 05:56:48 crc kubenswrapper[4865]: I1205 05:56:48.590091 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 05:56:48 crc kubenswrapper[4865]: I1205 05:56:48.635114 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 05:56:48 crc kubenswrapper[4865]: I1205 05:56:48.674680 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 05 05:56:48 crc kubenswrapper[4865]: I1205 05:56:48.815699 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 05:56:48 crc kubenswrapper[4865]: I1205 05:56:48.896589 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 05:56:49 crc kubenswrapper[4865]: I1205 05:56:49.127739 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 05:56:49 crc kubenswrapper[4865]: I1205 05:56:49.524032 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 05:56:49 crc kubenswrapper[4865]: I1205 05:56:49.578868 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 05 05:56:49 crc kubenswrapper[4865]: I1205 05:56:49.767879 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 05:56:49 crc kubenswrapper[4865]: I1205 05:56:49.925849 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 05:56:54 crc kubenswrapper[4865]: I1205 05:56:54.864552 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 05:56:55 crc kubenswrapper[4865]: I1205 05:56:55.550785 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 05:56:57 crc 
kubenswrapper[4865]: I1205 05:56:57.383011 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 05:56:57 crc kubenswrapper[4865]: I1205 05:56:57.650888 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 05:56:57 crc kubenswrapper[4865]: I1205 05:56:57.792500 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 05:56:58 crc kubenswrapper[4865]: I1205 05:56:58.800558 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 05:56:59 crc kubenswrapper[4865]: I1205 05:56:59.248905 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 05:57:00 crc kubenswrapper[4865]: I1205 05:57:00.590432 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 05 05:57:00 crc kubenswrapper[4865]: I1205 05:57:00.653587 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 05 05:57:00 crc kubenswrapper[4865]: I1205 05:57:00.718978 4865 generic.go:334] "Generic (PLEG): container finished" podID="1f49a368-065d-4057-a044-a019eba9ce9e" containerID="420b599485c42d5136bba3e581869141e0c3970fbbaf3e429e65daa33707125d" exitCode=0 Dec 05 05:57:00 crc kubenswrapper[4865]: I1205 05:57:00.719035 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" event={"ID":"1f49a368-065d-4057-a044-a019eba9ce9e","Type":"ContainerDied","Data":"420b599485c42d5136bba3e581869141e0c3970fbbaf3e429e65daa33707125d"} Dec 05 05:57:00 crc kubenswrapper[4865]: I1205 05:57:00.719632 4865 scope.go:117] "RemoveContainer" containerID="420b599485c42d5136bba3e581869141e0c3970fbbaf3e429e65daa33707125d" Dec 05 05:57:01 crc kubenswrapper[4865]: I1205 05:57:01.346404 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 05:57:01 crc kubenswrapper[4865]: I1205 05:57:01.348409 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 05:57:01 crc kubenswrapper[4865]: I1205 05:57:01.407232 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 05:57:01 crc kubenswrapper[4865]: I1205 05:57:01.725466 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" event={"ID":"1f49a368-065d-4057-a044-a019eba9ce9e","Type":"ContainerStarted","Data":"68ab0d28476f44451dadc0f67a3d1bc6c68028ad03f8baca0545f8a47f6903e2"} Dec 05 05:57:01 crc kubenswrapper[4865]: I1205 05:57:01.725954 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" Dec 05 05:57:01 crc kubenswrapper[4865]: I1205 05:57:01.736986 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" Dec 05 05:57:02 crc kubenswrapper[4865]: I1205 05:57:02.231439 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 05:57:02 crc 
kubenswrapper[4865]: I1205 05:57:02.706529 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 05:57:03 crc kubenswrapper[4865]: I1205 05:57:03.228236 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 05:57:03 crc kubenswrapper[4865]: I1205 05:57:03.831610 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 05 05:57:03 crc kubenswrapper[4865]: I1205 05:57:03.961460 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 05:57:04 crc kubenswrapper[4865]: I1205 05:57:04.318886 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 05:57:04 crc kubenswrapper[4865]: I1205 05:57:04.562046 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 05:57:04 crc kubenswrapper[4865]: I1205 05:57:04.779029 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 05:57:04 crc kubenswrapper[4865]: I1205 05:57:04.853964 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.094720 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.204121 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.354924 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.539662 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.686133 4865 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.687768 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jjg7p" podStartSLOduration=75.577220311 podStartE2EDuration="3m13.687743417s" podCreationTimestamp="2025-12-05 05:53:52 +0000 UTC" firstStartedPulling="2025-12-05 05:53:58.432196482 +0000 UTC m=+57.712207704" lastFinishedPulling="2025-12-05 05:55:56.542719588 +0000 UTC m=+175.822730810" observedRunningTime="2025-12-05 05:56:23.764596514 +0000 UTC m=+203.044607736" watchObservedRunningTime="2025-12-05 05:57:05.687743417 +0000 UTC m=+244.967754649" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.689655 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xljcc" podStartSLOduration=59.722291804 podStartE2EDuration="3m11.689646101s" podCreationTimestamp="2025-12-05 05:53:54 +0000 UTC" firstStartedPulling="2025-12-05 05:53:59.607966621 +0000 UTC m=+58.887977843" lastFinishedPulling="2025-12-05 05:56:11.575320918 +0000 UTC m=+190.855332140" observedRunningTime="2025-12-05 05:56:23.70960807 
+0000 UTC m=+202.989619292" watchObservedRunningTime="2025-12-05 05:57:05.689646101 +0000 UTC m=+244.969657333" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.691532 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5zhc2" podStartSLOduration=76.448279779 podStartE2EDuration="3m14.691519214s" podCreationTimestamp="2025-12-05 05:53:51 +0000 UTC" firstStartedPulling="2025-12-05 05:53:58.364059978 +0000 UTC m=+57.644071200" lastFinishedPulling="2025-12-05 05:55:56.607299413 +0000 UTC m=+175.887310635" observedRunningTime="2025-12-05 05:56:23.530956712 +0000 UTC m=+202.810967944" watchObservedRunningTime="2025-12-05 05:57:05.691519214 +0000 UTC m=+244.971530446" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.691853 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bdk79" podStartSLOduration=44.267927441 podStartE2EDuration="3m14.691811522s" podCreationTimestamp="2025-12-05 05:53:51 +0000 UTC" firstStartedPulling="2025-12-05 05:53:55.767556419 +0000 UTC m=+55.047567641" lastFinishedPulling="2025-12-05 05:56:26.19144048 +0000 UTC m=+205.471451722" observedRunningTime="2025-12-05 05:56:26.534191635 +0000 UTC m=+205.814202857" watchObservedRunningTime="2025-12-05 05:57:05.691811522 +0000 UTC m=+244.971822794" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.692387 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fsjkk" podStartSLOduration=77.593785305 podStartE2EDuration="3m15.692373748s" podCreationTimestamp="2025-12-05 05:53:50 +0000 UTC" firstStartedPulling="2025-12-05 05:53:58.394478842 +0000 UTC m=+57.674490064" lastFinishedPulling="2025-12-05 05:55:56.493067285 +0000 UTC m=+175.773078507" observedRunningTime="2025-12-05 05:56:23.727349462 +0000 UTC m=+203.007360684" watchObservedRunningTime="2025-12-05 05:57:05.692373748 +0000 UTC m=+244.972384980" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.693133 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ccqrr" podStartSLOduration=55.03545341 podStartE2EDuration="3m14.693126169s" podCreationTimestamp="2025-12-05 05:53:51 +0000 UTC" firstStartedPulling="2025-12-05 05:53:55.754793957 +0000 UTC m=+55.034805189" lastFinishedPulling="2025-12-05 05:56:15.412466726 +0000 UTC m=+194.692477948" observedRunningTime="2025-12-05 05:56:23.74676788 +0000 UTC m=+203.026779112" watchObservedRunningTime="2025-12-05 05:57:05.693126169 +0000 UTC m=+244.973137411" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.693346 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mds9l" podStartSLOduration=74.606985263 podStartE2EDuration="3m12.693339705s" podCreationTimestamp="2025-12-05 05:53:53 +0000 UTC" firstStartedPulling="2025-12-05 05:53:58.443198534 +0000 UTC m=+57.723209756" lastFinishedPulling="2025-12-05 05:55:56.529552976 +0000 UTC m=+175.809564198" observedRunningTime="2025-12-05 05:56:23.684945313 +0000 UTC m=+202.964956545" watchObservedRunningTime="2025-12-05 05:57:05.693339705 +0000 UTC m=+244.973350957" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.693538 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rw9pr" podStartSLOduration=56.997289509 podStartE2EDuration="3m11.693530271s" 
podCreationTimestamp="2025-12-05 05:53:54 +0000 UTC" firstStartedPulling="2025-12-05 05:54:00.715691289 +0000 UTC m=+59.995702511" lastFinishedPulling="2025-12-05 05:56:15.411932051 +0000 UTC m=+194.691943273" observedRunningTime="2025-12-05 05:56:23.598211302 +0000 UTC m=+202.878222524" watchObservedRunningTime="2025-12-05 05:57:05.693530271 +0000 UTC m=+244.973541513" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.694625 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-82xnx","openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.694788 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-86f4ddc759-vvtgp","openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.695408 4865 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ed818037-beb4-4918-a648-c51549a1b8dc" Dec 05 05:57:05 crc kubenswrapper[4865]: E1205 05:57:05.695514 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" containerName="installer" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.695566 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" containerName="installer" Dec 05 05:57:05 crc kubenswrapper[4865]: E1205 05:57:05.695596 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87a82cae-057c-47d5-9703-eb48128e1bd9" containerName="oauth-openshift" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.695607 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="87a82cae-057c-47d5-9703-eb48128e1bd9" containerName="oauth-openshift" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.695759 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="87a82cae-057c-47d5-9703-eb48128e1bd9" containerName="oauth-openshift" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.695783 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c32bdbe-2acd-4833-b0ca-a9b9eecae8e6" containerName="installer" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.695521 4865 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ed818037-beb4-4918-a648-c51549a1b8dc" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.696409 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.698890 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.699006 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.699092 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.700419 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.702024 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.702156 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.702554 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.702999 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.704067 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.704123 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.704256 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.704398 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.704483 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.708872 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.711165 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.717224 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.725232 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=42.725216366 podStartE2EDuration="42.725216366s" podCreationTimestamp="2025-12-05 05:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-05 05:57:05.723434686 +0000 UTC m=+245.003445968" watchObservedRunningTime="2025-12-05 05:57:05.725216366 +0000 UTC m=+245.005227588" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.814719 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/32b42b30-6881-4858-8b34-806c3307bcde-audit-policies\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.815129 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.815279 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/32b42b30-6881-4858-8b34-806c3307bcde-audit-dir\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.815514 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9crp5\" (UniqueName: \"kubernetes.io/projected/32b42b30-6881-4858-8b34-806c3307bcde-kube-api-access-9crp5\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.815567 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.815655 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.815678 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-user-template-error\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.815793 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-user-template-login\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.815880 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-router-certs\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.815913 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.815962 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-session\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.816039 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-service-ca\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.816094 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.816138 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.917419 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-session\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.917774 
4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-service-ca\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.918025 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.918354 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.918546 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/32b42b30-6881-4858-8b34-806c3307bcde-audit-policies\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.918899 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.918659 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-service-ca\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.920121 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/32b42b30-6881-4858-8b34-806c3307bcde-audit-dir\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.919549 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/32b42b30-6881-4858-8b34-806c3307bcde-audit-dir\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.920633 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9crp5\" 
(UniqueName: \"kubernetes.io/projected/32b42b30-6881-4858-8b34-806c3307bcde-kube-api-access-9crp5\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.920821 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.921054 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.921225 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-user-template-error\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.921714 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-user-template-login\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.921756 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-router-certs\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.921781 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.921935 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/32b42b30-6881-4858-8b34-806c3307bcde-audit-policies\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.921651 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.925002 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.925928 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.926115 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-user-template-error\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.926792 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.926929 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.929027 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.929501 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.929887 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-router-certs\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 
05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.940389 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-user-template-login\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.940607 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/32b42b30-6881-4858-8b34-806c3307bcde-v4-0-config-system-session\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:05 crc kubenswrapper[4865]: I1205 05:57:05.940860 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9crp5\" (UniqueName: \"kubernetes.io/projected/32b42b30-6881-4858-8b34-806c3307bcde-kube-api-access-9crp5\") pod \"oauth-openshift-86f4ddc759-vvtgp\" (UID: \"32b42b30-6881-4858-8b34-806c3307bcde\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:06 crc kubenswrapper[4865]: I1205 05:57:06.021791 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:06 crc kubenswrapper[4865]: I1205 05:57:06.148054 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 05:57:06 crc kubenswrapper[4865]: I1205 05:57:06.477104 4865 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 05:57:06 crc kubenswrapper[4865]: I1205 05:57:06.554581 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 05 05:57:06 crc kubenswrapper[4865]: I1205 05:57:06.604536 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 05 05:57:06 crc kubenswrapper[4865]: I1205 05:57:06.753994 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 05 05:57:06 crc kubenswrapper[4865]: I1205 05:57:06.756473 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 05:57:06 crc kubenswrapper[4865]: I1205 05:57:06.756516 4865 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="374aabfce78bfe0ffbc1417ac3b56335f117e39c8a581823fa115d74e622cde8" exitCode=137 Dec 05 05:57:06 crc kubenswrapper[4865]: I1205 05:57:06.756553 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"374aabfce78bfe0ffbc1417ac3b56335f117e39c8a581823fa115d74e622cde8"} Dec 05 05:57:06 crc kubenswrapper[4865]: I1205 05:57:06.756579 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6d6b12f6f75d9422b3dbd729e9ef8205e968b78dab585fff577d0fb0257835b8"} Dec 05 05:57:06 crc kubenswrapper[4865]: I1205 05:57:06.756597 4865 scope.go:117] "RemoveContainer" containerID="c4decfc2a1886b97bc70deec1e0391c2ec3f1963ccf9bd83d9893d6cf1459a5d" Dec 05 05:57:06 crc kubenswrapper[4865]: I1205 05:57:06.821488 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 05:57:06 crc kubenswrapper[4865]: I1205 05:57:06.828501 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 05:57:07 crc kubenswrapper[4865]: I1205 05:57:07.017652 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87a82cae-057c-47d5-9703-eb48128e1bd9" path="/var/lib/kubelet/pods/87a82cae-057c-47d5-9703-eb48128e1bd9/volumes" Dec 05 05:57:07 crc kubenswrapper[4865]: I1205 05:57:07.286928 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 05:57:07 crc kubenswrapper[4865]: I1205 05:57:07.524064 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 05:57:07 crc kubenswrapper[4865]: I1205 05:57:07.770543 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 05 05:57:07 crc kubenswrapper[4865]: I1205 05:57:07.917808 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 05 05:57:08 crc kubenswrapper[4865]: I1205 05:57:08.042131 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 05:57:08 crc kubenswrapper[4865]: I1205 05:57:08.066093 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 05:57:08 crc kubenswrapper[4865]: I1205 05:57:08.135676 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 05:57:08 crc kubenswrapper[4865]: I1205 05:57:08.280974 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 05:57:08 crc kubenswrapper[4865]: I1205 05:57:08.706739 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 05:57:08 crc kubenswrapper[4865]: E1205 05:57:08.867800 4865 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 05 05:57:08 crc kubenswrapper[4865]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-86f4ddc759-vvtgp_openshift-authentication_32b42b30-6881-4858-8b34-806c3307bcde_0(e8c4928774c5b064203627aa73b1b3c44c2a059784f5011e8cebdb67133f50cc): error adding pod openshift-authentication_oauth-openshift-86f4ddc759-vvtgp to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e8c4928774c5b064203627aa73b1b3c44c2a059784f5011e8cebdb67133f50cc" Netns:"/var/run/netns/cb90d70f-0391-489e-8760-8b708f8a746f" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-86f4ddc759-vvtgp;K8S_POD_INFRA_CONTAINER_ID=e8c4928774c5b064203627aa73b1b3c44c2a059784f5011e8cebdb67133f50cc;K8S_POD_UID=32b42b30-6881-4858-8b34-806c3307bcde" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-86f4ddc759-vvtgp] networking: Multus: [openshift-authentication/oauth-openshift-86f4ddc759-vvtgp/32b42b30-6881-4858-8b34-806c3307bcde]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-86f4ddc759-vvtgp in out of cluster comm: pod "oauth-openshift-86f4ddc759-vvtgp" not found Dec 05 05:57:08 crc kubenswrapper[4865]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 05 05:57:08 crc kubenswrapper[4865]: > Dec 05 05:57:08 crc kubenswrapper[4865]: E1205 05:57:08.868187 4865 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 05 05:57:08 crc kubenswrapper[4865]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-86f4ddc759-vvtgp_openshift-authentication_32b42b30-6881-4858-8b34-806c3307bcde_0(e8c4928774c5b064203627aa73b1b3c44c2a059784f5011e8cebdb67133f50cc): error adding pod openshift-authentication_oauth-openshift-86f4ddc759-vvtgp to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e8c4928774c5b064203627aa73b1b3c44c2a059784f5011e8cebdb67133f50cc" Netns:"/var/run/netns/cb90d70f-0391-489e-8760-8b708f8a746f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-86f4ddc759-vvtgp;K8S_POD_INFRA_CONTAINER_ID=e8c4928774c5b064203627aa73b1b3c44c2a059784f5011e8cebdb67133f50cc;K8S_POD_UID=32b42b30-6881-4858-8b34-806c3307bcde" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-86f4ddc759-vvtgp] networking: Multus: [openshift-authentication/oauth-openshift-86f4ddc759-vvtgp/32b42b30-6881-4858-8b34-806c3307bcde]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-86f4ddc759-vvtgp in out of cluster comm: pod "oauth-openshift-86f4ddc759-vvtgp" not found Dec 05 05:57:08 crc kubenswrapper[4865]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 05 05:57:08 crc kubenswrapper[4865]: > pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:08 crc kubenswrapper[4865]: E1205 05:57:08.868225 4865 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 05 05:57:08 crc kubenswrapper[4865]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_oauth-openshift-86f4ddc759-vvtgp_openshift-authentication_32b42b30-6881-4858-8b34-806c3307bcde_0(e8c4928774c5b064203627aa73b1b3c44c2a059784f5011e8cebdb67133f50cc): error adding pod openshift-authentication_oauth-openshift-86f4ddc759-vvtgp to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e8c4928774c5b064203627aa73b1b3c44c2a059784f5011e8cebdb67133f50cc" Netns:"/var/run/netns/cb90d70f-0391-489e-8760-8b708f8a746f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-86f4ddc759-vvtgp;K8S_POD_INFRA_CONTAINER_ID=e8c4928774c5b064203627aa73b1b3c44c2a059784f5011e8cebdb67133f50cc;K8S_POD_UID=32b42b30-6881-4858-8b34-806c3307bcde" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-86f4ddc759-vvtgp] networking: Multus: [openshift-authentication/oauth-openshift-86f4ddc759-vvtgp/32b42b30-6881-4858-8b34-806c3307bcde]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-86f4ddc759-vvtgp in out of cluster comm: pod "oauth-openshift-86f4ddc759-vvtgp" not found Dec 05 05:57:08 crc kubenswrapper[4865]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 05 05:57:08 crc kubenswrapper[4865]: > pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:08 crc kubenswrapper[4865]: E1205 05:57:08.868291 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-86f4ddc759-vvtgp_openshift-authentication(32b42b30-6881-4858-8b34-806c3307bcde)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-86f4ddc759-vvtgp_openshift-authentication(32b42b30-6881-4858-8b34-806c3307bcde)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-86f4ddc759-vvtgp_openshift-authentication_32b42b30-6881-4858-8b34-806c3307bcde_0(e8c4928774c5b064203627aa73b1b3c44c2a059784f5011e8cebdb67133f50cc): error adding pod openshift-authentication_oauth-openshift-86f4ddc759-vvtgp to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"e8c4928774c5b064203627aa73b1b3c44c2a059784f5011e8cebdb67133f50cc\\\" Netns:\\\"/var/run/netns/cb90d70f-0391-489e-8760-8b708f8a746f\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-86f4ddc759-vvtgp;K8S_POD_INFRA_CONTAINER_ID=e8c4928774c5b064203627aa73b1b3c44c2a059784f5011e8cebdb67133f50cc;K8S_POD_UID=32b42b30-6881-4858-8b34-806c3307bcde\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-86f4ddc759-vvtgp] networking: Multus: [openshift-authentication/oauth-openshift-86f4ddc759-vvtgp/32b42b30-6881-4858-8b34-806c3307bcde]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-86f4ddc759-vvtgp in out of cluster comm: pod \\\"oauth-openshift-86f4ddc759-vvtgp\\\" 
not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" podUID="32b42b30-6881-4858-8b34-806c3307bcde" Dec 05 05:57:08 crc kubenswrapper[4865]: I1205 05:57:08.946551 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 05:57:09 crc kubenswrapper[4865]: I1205 05:57:09.031286 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:57:09 crc kubenswrapper[4865]: I1205 05:57:09.036894 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 05:57:09 crc kubenswrapper[4865]: I1205 05:57:09.166969 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 05:57:09 crc kubenswrapper[4865]: I1205 05:57:09.379650 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 05:57:09 crc kubenswrapper[4865]: I1205 05:57:09.478537 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 05:57:09 crc kubenswrapper[4865]: I1205 05:57:09.828809 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 05 05:57:10 crc kubenswrapper[4865]: I1205 05:57:10.267198 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 05:57:10 crc kubenswrapper[4865]: I1205 05:57:10.377886 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 05 05:57:10 crc kubenswrapper[4865]: I1205 05:57:10.527954 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 05 05:57:10 crc kubenswrapper[4865]: I1205 05:57:10.869503 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 05:57:11 crc kubenswrapper[4865]: I1205 05:57:11.127793 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 05:57:11 crc kubenswrapper[4865]: I1205 05:57:11.224802 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:57:11 crc kubenswrapper[4865]: I1205 05:57:11.301376 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 05 05:57:11 crc kubenswrapper[4865]: I1205 05:57:11.314929 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 05:57:11 crc kubenswrapper[4865]: I1205 05:57:11.734221 4865 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 05:57:11 crc kubenswrapper[4865]: I1205 05:57:11.753946 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 05:57:12 crc kubenswrapper[4865]: I1205 05:57:12.157323 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 05:57:12 crc kubenswrapper[4865]: I1205 05:57:12.535988 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 05:57:12 crc kubenswrapper[4865]: I1205 05:57:12.724183 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 05:57:12 crc kubenswrapper[4865]: I1205 05:57:12.785918 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 05:57:12 crc kubenswrapper[4865]: I1205 05:57:12.805157 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 05 05:57:13 crc kubenswrapper[4865]: I1205 05:57:13.179926 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 05:57:13 crc kubenswrapper[4865]: I1205 05:57:13.372211 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 05 05:57:13 crc kubenswrapper[4865]: I1205 05:57:13.384336 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 05:57:14 crc kubenswrapper[4865]: I1205 05:57:14.512229 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 05 05:57:14 crc kubenswrapper[4865]: I1205 05:57:14.675301 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 05:57:14 crc kubenswrapper[4865]: I1205 05:57:14.707962 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 05:57:14 crc kubenswrapper[4865]: I1205 05:57:14.954632 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 05 05:57:15 crc kubenswrapper[4865]: I1205 05:57:15.089490 4865 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 05:57:15 crc kubenswrapper[4865]: I1205 05:57:15.304034 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 05:57:15 crc kubenswrapper[4865]: I1205 05:57:15.878568 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:57:15 crc kubenswrapper[4865]: I1205 05:57:15.883245 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:57:16 crc kubenswrapper[4865]: I1205 05:57:16.238652 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 05:57:16 crc kubenswrapper[4865]: 
I1205 05:57:16.458178 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 05:57:16 crc kubenswrapper[4865]: I1205 05:57:16.487538 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 05:57:16 crc kubenswrapper[4865]: I1205 05:57:16.556695 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 05 05:57:16 crc kubenswrapper[4865]: I1205 05:57:16.651693 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 05 05:57:16 crc kubenswrapper[4865]: I1205 05:57:16.840188 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 05:57:16 crc kubenswrapper[4865]: I1205 05:57:16.995587 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 05:57:17 crc kubenswrapper[4865]: I1205 05:57:17.062605 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 05:57:17 crc kubenswrapper[4865]: I1205 05:57:17.776852 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 05:57:17 crc kubenswrapper[4865]: I1205 05:57:17.915688 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 05:57:17 crc kubenswrapper[4865]: I1205 05:57:17.945027 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 05:57:18 crc kubenswrapper[4865]: I1205 05:57:18.158940 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 05:57:18 crc kubenswrapper[4865]: I1205 05:57:18.160213 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 05 05:57:18 crc kubenswrapper[4865]: I1205 05:57:18.854342 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 05:57:19 crc kubenswrapper[4865]: I1205 05:57:19.109000 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 05:57:19 crc kubenswrapper[4865]: I1205 05:57:19.484617 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 05:57:20 crc kubenswrapper[4865]: I1205 05:57:20.156489 4865 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 05:57:20 crc kubenswrapper[4865]: I1205 05:57:20.156739 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://b8138f65307669cdc4ff02f4694ffd6820f91b6bb64b80c44de096bbc6f5fa2d" gracePeriod=5 Dec 05 05:57:20 crc kubenswrapper[4865]: I1205 05:57:20.294648 4865 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 05 05:57:20 crc kubenswrapper[4865]: I1205 05:57:20.365798 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 05:57:20 crc kubenswrapper[4865]: I1205 05:57:20.380678 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 05:57:20 crc kubenswrapper[4865]: I1205 05:57:20.888251 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 05:57:21 crc kubenswrapper[4865]: I1205 05:57:21.018122 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:21 crc kubenswrapper[4865]: I1205 05:57:21.018731 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:21 crc kubenswrapper[4865]: I1205 05:57:21.062027 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 05:57:21 crc kubenswrapper[4865]: I1205 05:57:21.351445 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-86f4ddc759-vvtgp"] Dec 05 05:57:21 crc kubenswrapper[4865]: W1205 05:57:21.359212 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32b42b30_6881_4858_8b34_806c3307bcde.slice/crio-eb889e6b901e3f0f0bbbbb74a534f746c98f94a84c11817f34774b50203acf52 WatchSource:0}: Error finding container eb889e6b901e3f0f0bbbbb74a534f746c98f94a84c11817f34774b50203acf52: Status 404 returned error can't find the container with id eb889e6b901e3f0f0bbbbb74a534f746c98f94a84c11817f34774b50203acf52 Dec 05 05:57:21 crc kubenswrapper[4865]: I1205 05:57:21.846443 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 05:57:21 crc kubenswrapper[4865]: I1205 05:57:21.860641 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" event={"ID":"32b42b30-6881-4858-8b34-806c3307bcde","Type":"ContainerStarted","Data":"f2855c3b545e8a9571588edfdd91d76aee6457086df14fa10a3f3e4786038ca7"} Dec 05 05:57:21 crc kubenswrapper[4865]: I1205 05:57:21.860681 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" event={"ID":"32b42b30-6881-4858-8b34-806c3307bcde","Type":"ContainerStarted","Data":"eb889e6b901e3f0f0bbbbb74a534f746c98f94a84c11817f34774b50203acf52"} Dec 05 05:57:21 crc kubenswrapper[4865]: I1205 05:57:21.861036 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:21 crc kubenswrapper[4865]: I1205 05:57:21.887305 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" podStartSLOduration=98.887288152 podStartE2EDuration="1m38.887288152s" podCreationTimestamp="2025-12-05 05:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:57:21.885350087 +0000 UTC m=+261.165361309" watchObservedRunningTime="2025-12-05 
05:57:21.887288152 +0000 UTC m=+261.167299374" Dec 05 05:57:22 crc kubenswrapper[4865]: I1205 05:57:22.211979 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-86f4ddc759-vvtgp" Dec 05 05:57:22 crc kubenswrapper[4865]: I1205 05:57:22.629100 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 05:57:22 crc kubenswrapper[4865]: I1205 05:57:22.796702 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.308793 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9"] Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.309011 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" podUID="4ff00812-1a0c-4bbc-8222-d7765505af6b" containerName="route-controller-manager" containerID="cri-o://9a3faeca9dc31aa6a02c4a07a9d9ae868dc05e51758895ac25b80a3ee2b6d5ca" gracePeriod=30 Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.330144 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.339505 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hrb2j"] Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.339715 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" podUID="8cc2c35c-7bb7-4475-a318-0133139b9359" containerName="controller-manager" containerID="cri-o://ecc943f06edb698b6a5a2ee1dc0e642dcb6e04c57a75b36a7bae7dd6ff188786" gracePeriod=30 Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.474176 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.756724 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.843497 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.903370 4865 generic.go:334] "Generic (PLEG): container finished" podID="4ff00812-1a0c-4bbc-8222-d7765505af6b" containerID="9a3faeca9dc31aa6a02c4a07a9d9ae868dc05e51758895ac25b80a3ee2b6d5ca" exitCode=0 Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.903538 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" event={"ID":"4ff00812-1a0c-4bbc-8222-d7765505af6b","Type":"ContainerDied","Data":"9a3faeca9dc31aa6a02c4a07a9d9ae868dc05e51758895ac25b80a3ee2b6d5ca"} Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.903578 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" event={"ID":"4ff00812-1a0c-4bbc-8222-d7765505af6b","Type":"ContainerDied","Data":"c7314007fcf9afbdb17a0d3e04d7f10fe37c0bfc559874e550f9ee5b834f0ad4"} Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.903606 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7314007fcf9afbdb17a0d3e04d7f10fe37c0bfc559874e550f9ee5b834f0ad4" Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.916278 4865 generic.go:334] "Generic (PLEG): container finished" podID="8cc2c35c-7bb7-4475-a318-0133139b9359" containerID="ecc943f06edb698b6a5a2ee1dc0e642dcb6e04c57a75b36a7bae7dd6ff188786" exitCode=0 Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.916490 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" event={"ID":"8cc2c35c-7bb7-4475-a318-0133139b9359","Type":"ContainerDied","Data":"ecc943f06edb698b6a5a2ee1dc0e642dcb6e04c57a75b36a7bae7dd6ff188786"} Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.916523 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" event={"ID":"8cc2c35c-7bb7-4475-a318-0133139b9359","Type":"ContainerDied","Data":"8a3eb92fc43cdedd2eae21b064b17ef9011708e3c915217dd10888e11a0912ed"} Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.916542 4865 scope.go:117] "RemoveContainer" containerID="ecc943f06edb698b6a5a2ee1dc0e642dcb6e04c57a75b36a7bae7dd6ff188786" Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.917816 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hrb2j" Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.919338 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.936435 4865 scope.go:117] "RemoveContainer" containerID="ecc943f06edb698b6a5a2ee1dc0e642dcb6e04c57a75b36a7bae7dd6ff188786" Dec 05 05:57:23 crc kubenswrapper[4865]: E1205 05:57:23.936815 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecc943f06edb698b6a5a2ee1dc0e642dcb6e04c57a75b36a7bae7dd6ff188786\": container with ID starting with ecc943f06edb698b6a5a2ee1dc0e642dcb6e04c57a75b36a7bae7dd6ff188786 not found: ID does not exist" containerID="ecc943f06edb698b6a5a2ee1dc0e642dcb6e04c57a75b36a7bae7dd6ff188786" Dec 05 05:57:23 crc kubenswrapper[4865]: I1205 05:57:23.936870 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecc943f06edb698b6a5a2ee1dc0e642dcb6e04c57a75b36a7bae7dd6ff188786"} err="failed to get container status \"ecc943f06edb698b6a5a2ee1dc0e642dcb6e04c57a75b36a7bae7dd6ff188786\": rpc error: code = NotFound desc = could not find container \"ecc943f06edb698b6a5a2ee1dc0e642dcb6e04c57a75b36a7bae7dd6ff188786\": container with ID starting with ecc943f06edb698b6a5a2ee1dc0e642dcb6e04c57a75b36a7bae7dd6ff188786 not found: ID does not exist" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.022527 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qbxk\" (UniqueName: \"kubernetes.io/projected/4ff00812-1a0c-4bbc-8222-d7765505af6b-kube-api-access-9qbxk\") pod \"4ff00812-1a0c-4bbc-8222-d7765505af6b\" (UID: \"4ff00812-1a0c-4bbc-8222-d7765505af6b\") " Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.022812 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ff00812-1a0c-4bbc-8222-d7765505af6b-client-ca\") pod \"4ff00812-1a0c-4bbc-8222-d7765505af6b\" (UID: \"4ff00812-1a0c-4bbc-8222-d7765505af6b\") " Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.022958 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ff00812-1a0c-4bbc-8222-d7765505af6b-serving-cert\") pod \"4ff00812-1a0c-4bbc-8222-d7765505af6b\" (UID: \"4ff00812-1a0c-4bbc-8222-d7765505af6b\") " Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.023064 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ff00812-1a0c-4bbc-8222-d7765505af6b-config\") pod \"4ff00812-1a0c-4bbc-8222-d7765505af6b\" (UID: \"4ff00812-1a0c-4bbc-8222-d7765505af6b\") " Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.023189 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhl28\" (UniqueName: \"kubernetes.io/projected/8cc2c35c-7bb7-4475-a318-0133139b9359-kube-api-access-hhl28\") pod \"8cc2c35c-7bb7-4475-a318-0133139b9359\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.023292 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cc2c35c-7bb7-4475-a318-0133139b9359-config\") pod \"8cc2c35c-7bb7-4475-a318-0133139b9359\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.023422 4865 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cc2c35c-7bb7-4475-a318-0133139b9359-proxy-ca-bundles\") pod \"8cc2c35c-7bb7-4475-a318-0133139b9359\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.023485 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ff00812-1a0c-4bbc-8222-d7765505af6b-client-ca" (OuterVolumeSpecName: "client-ca") pod "4ff00812-1a0c-4bbc-8222-d7765505af6b" (UID: "4ff00812-1a0c-4bbc-8222-d7765505af6b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.023604 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cc2c35c-7bb7-4475-a318-0133139b9359-serving-cert\") pod \"8cc2c35c-7bb7-4475-a318-0133139b9359\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.023721 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cc2c35c-7bb7-4475-a318-0133139b9359-client-ca\") pod \"8cc2c35c-7bb7-4475-a318-0133139b9359\" (UID: \"8cc2c35c-7bb7-4475-a318-0133139b9359\") " Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.023891 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ff00812-1a0c-4bbc-8222-d7765505af6b-config" (OuterVolumeSpecName: "config") pod "4ff00812-1a0c-4bbc-8222-d7765505af6b" (UID: "4ff00812-1a0c-4bbc-8222-d7765505af6b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.024316 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ff00812-1a0c-4bbc-8222-d7765505af6b-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.024442 4865 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ff00812-1a0c-4bbc-8222-d7765505af6b-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.027877 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cc2c35c-7bb7-4475-a318-0133139b9359-client-ca" (OuterVolumeSpecName: "client-ca") pod "8cc2c35c-7bb7-4475-a318-0133139b9359" (UID: "8cc2c35c-7bb7-4475-a318-0133139b9359"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.028157 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cc2c35c-7bb7-4475-a318-0133139b9359-config" (OuterVolumeSpecName: "config") pod "8cc2c35c-7bb7-4475-a318-0133139b9359" (UID: "8cc2c35c-7bb7-4475-a318-0133139b9359"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.028334 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cc2c35c-7bb7-4475-a318-0133139b9359-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8cc2c35c-7bb7-4475-a318-0133139b9359" (UID: "8cc2c35c-7bb7-4475-a318-0133139b9359"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.030657 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cc2c35c-7bb7-4475-a318-0133139b9359-kube-api-access-hhl28" (OuterVolumeSpecName: "kube-api-access-hhl28") pod "8cc2c35c-7bb7-4475-a318-0133139b9359" (UID: "8cc2c35c-7bb7-4475-a318-0133139b9359"). InnerVolumeSpecName "kube-api-access-hhl28". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.030721 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ff00812-1a0c-4bbc-8222-d7765505af6b-kube-api-access-9qbxk" (OuterVolumeSpecName: "kube-api-access-9qbxk") pod "4ff00812-1a0c-4bbc-8222-d7765505af6b" (UID: "4ff00812-1a0c-4bbc-8222-d7765505af6b"). InnerVolumeSpecName "kube-api-access-9qbxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.030941 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff00812-1a0c-4bbc-8222-d7765505af6b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4ff00812-1a0c-4bbc-8222-d7765505af6b" (UID: "4ff00812-1a0c-4bbc-8222-d7765505af6b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.036284 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cc2c35c-7bb7-4475-a318-0133139b9359-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cc2c35c-7bb7-4475-a318-0133139b9359" (UID: "8cc2c35c-7bb7-4475-a318-0133139b9359"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.126084 4865 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ff00812-1a0c-4bbc-8222-d7765505af6b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.126127 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhl28\" (UniqueName: \"kubernetes.io/projected/8cc2c35c-7bb7-4475-a318-0133139b9359-kube-api-access-hhl28\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.126143 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cc2c35c-7bb7-4475-a318-0133139b9359-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.126152 4865 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cc2c35c-7bb7-4475-a318-0133139b9359-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.126162 4865 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cc2c35c-7bb7-4475-a318-0133139b9359-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.126170 4865 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cc2c35c-7bb7-4475-a318-0133139b9359-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.126178 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qbxk\" (UniqueName: \"kubernetes.io/projected/4ff00812-1a0c-4bbc-8222-d7765505af6b-kube-api-access-9qbxk\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.250399 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hrb2j"] Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.255449 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hrb2j"] Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.273094 4865 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.346108 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.614762 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.687152 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.875364 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.922366 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9" Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.954839 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9"] Dec 05 05:57:24 crc kubenswrapper[4865]: I1205 05:57:24.956559 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5gw9"] Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.023390 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ff00812-1a0c-4bbc-8222-d7765505af6b" path="/var/lib/kubelet/pods/4ff00812-1a0c-4bbc-8222-d7765505af6b/volumes" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.024261 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cc2c35c-7bb7-4475-a318-0133139b9359" path="/var/lib/kubelet/pods/8cc2c35c-7bb7-4475-a318-0133139b9359/volumes" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.302007 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6bc9444ff5-5759d"] Dec 05 05:57:25 crc kubenswrapper[4865]: E1205 05:57:25.302289 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cc2c35c-7bb7-4475-a318-0133139b9359" containerName="controller-manager" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.302302 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc2c35c-7bb7-4475-a318-0133139b9359" containerName="controller-manager" Dec 05 05:57:25 crc kubenswrapper[4865]: E1205 05:57:25.302315 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ff00812-1a0c-4bbc-8222-d7765505af6b" containerName="route-controller-manager" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.302321 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff00812-1a0c-4bbc-8222-d7765505af6b" containerName="route-controller-manager" Dec 05 05:57:25 crc kubenswrapper[4865]: E1205 05:57:25.302331 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.302337 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.302435 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.302446 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cc2c35c-7bb7-4475-a318-0133139b9359" containerName="controller-manager" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.302456 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ff00812-1a0c-4bbc-8222-d7765505af6b" containerName="route-controller-manager" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.302857 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.305774 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.306092 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.306221 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh"] Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.306336 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.306582 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.307169 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.307817 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.308691 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.309507 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.310026 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.310525 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.311359 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.311704 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.319391 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.322127 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.331883 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bc9444ff5-5759d"] Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.336509 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh"] Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.447003 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/16331e67-7ae2-4c7d-9d74-8603551671cf-serving-cert\") pod \"controller-manager-6bc9444ff5-5759d\" (UID: \"16331e67-7ae2-4c7d-9d74-8603551671cf\") " pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.447074 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16331e67-7ae2-4c7d-9d74-8603551671cf-proxy-ca-bundles\") pod \"controller-manager-6bc9444ff5-5759d\" (UID: \"16331e67-7ae2-4c7d-9d74-8603551671cf\") " pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.447102 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16331e67-7ae2-4c7d-9d74-8603551671cf-config\") pod \"controller-manager-6bc9444ff5-5759d\" (UID: \"16331e67-7ae2-4c7d-9d74-8603551671cf\") " pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.447134 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qcgz\" (UniqueName: \"kubernetes.io/projected/1c166827-bbdd-4c8a-bb22-82214666b20b-kube-api-access-6qcgz\") pod \"route-controller-manager-f4c9dbfbf-wcjfh\" (UID: \"1c166827-bbdd-4c8a-bb22-82214666b20b\") " pod="openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.447192 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c166827-bbdd-4c8a-bb22-82214666b20b-client-ca\") pod \"route-controller-manager-f4c9dbfbf-wcjfh\" (UID: \"1c166827-bbdd-4c8a-bb22-82214666b20b\") " pod="openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.447213 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c166827-bbdd-4c8a-bb22-82214666b20b-serving-cert\") pod \"route-controller-manager-f4c9dbfbf-wcjfh\" (UID: \"1c166827-bbdd-4c8a-bb22-82214666b20b\") " pod="openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.447246 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcn2k\" (UniqueName: \"kubernetes.io/projected/16331e67-7ae2-4c7d-9d74-8603551671cf-kube-api-access-lcn2k\") pod \"controller-manager-6bc9444ff5-5759d\" (UID: \"16331e67-7ae2-4c7d-9d74-8603551671cf\") " pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.447264 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16331e67-7ae2-4c7d-9d74-8603551671cf-client-ca\") pod \"controller-manager-6bc9444ff5-5759d\" (UID: \"16331e67-7ae2-4c7d-9d74-8603551671cf\") " pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.447479 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1c166827-bbdd-4c8a-bb22-82214666b20b-config\") pod \"route-controller-manager-f4c9dbfbf-wcjfh\" (UID: \"1c166827-bbdd-4c8a-bb22-82214666b20b\") " pod="openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.548248 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16331e67-7ae2-4c7d-9d74-8603551671cf-serving-cert\") pod \"controller-manager-6bc9444ff5-5759d\" (UID: \"16331e67-7ae2-4c7d-9d74-8603551671cf\") " pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.548305 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16331e67-7ae2-4c7d-9d74-8603551671cf-proxy-ca-bundles\") pod \"controller-manager-6bc9444ff5-5759d\" (UID: \"16331e67-7ae2-4c7d-9d74-8603551671cf\") " pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.548323 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16331e67-7ae2-4c7d-9d74-8603551671cf-config\") pod \"controller-manager-6bc9444ff5-5759d\" (UID: \"16331e67-7ae2-4c7d-9d74-8603551671cf\") " pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.548366 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qcgz\" (UniqueName: \"kubernetes.io/projected/1c166827-bbdd-4c8a-bb22-82214666b20b-kube-api-access-6qcgz\") pod \"route-controller-manager-f4c9dbfbf-wcjfh\" (UID: \"1c166827-bbdd-4c8a-bb22-82214666b20b\") " pod="openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.548387 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c166827-bbdd-4c8a-bb22-82214666b20b-client-ca\") pod \"route-controller-manager-f4c9dbfbf-wcjfh\" (UID: \"1c166827-bbdd-4c8a-bb22-82214666b20b\") " pod="openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.548415 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c166827-bbdd-4c8a-bb22-82214666b20b-serving-cert\") pod \"route-controller-manager-f4c9dbfbf-wcjfh\" (UID: \"1c166827-bbdd-4c8a-bb22-82214666b20b\") " pod="openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.548452 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcn2k\" (UniqueName: \"kubernetes.io/projected/16331e67-7ae2-4c7d-9d74-8603551671cf-kube-api-access-lcn2k\") pod \"controller-manager-6bc9444ff5-5759d\" (UID: \"16331e67-7ae2-4c7d-9d74-8603551671cf\") " pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.548475 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16331e67-7ae2-4c7d-9d74-8603551671cf-client-ca\") pod \"controller-manager-6bc9444ff5-5759d\" (UID: 
\"16331e67-7ae2-4c7d-9d74-8603551671cf\") " pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.548520 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c166827-bbdd-4c8a-bb22-82214666b20b-config\") pod \"route-controller-manager-f4c9dbfbf-wcjfh\" (UID: \"1c166827-bbdd-4c8a-bb22-82214666b20b\") " pod="openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.549976 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16331e67-7ae2-4c7d-9d74-8603551671cf-client-ca\") pod \"controller-manager-6bc9444ff5-5759d\" (UID: \"16331e67-7ae2-4c7d-9d74-8603551671cf\") " pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.550008 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c166827-bbdd-4c8a-bb22-82214666b20b-config\") pod \"route-controller-manager-f4c9dbfbf-wcjfh\" (UID: \"1c166827-bbdd-4c8a-bb22-82214666b20b\") " pod="openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.549983 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c166827-bbdd-4c8a-bb22-82214666b20b-client-ca\") pod \"route-controller-manager-f4c9dbfbf-wcjfh\" (UID: \"1c166827-bbdd-4c8a-bb22-82214666b20b\") " pod="openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.550184 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16331e67-7ae2-4c7d-9d74-8603551671cf-config\") pod \"controller-manager-6bc9444ff5-5759d\" (UID: \"16331e67-7ae2-4c7d-9d74-8603551671cf\") " pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.551735 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16331e67-7ae2-4c7d-9d74-8603551671cf-proxy-ca-bundles\") pod \"controller-manager-6bc9444ff5-5759d\" (UID: \"16331e67-7ae2-4c7d-9d74-8603551671cf\") " pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.554708 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c166827-bbdd-4c8a-bb22-82214666b20b-serving-cert\") pod \"route-controller-manager-f4c9dbfbf-wcjfh\" (UID: \"1c166827-bbdd-4c8a-bb22-82214666b20b\") " pod="openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.561814 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16331e67-7ae2-4c7d-9d74-8603551671cf-serving-cert\") pod \"controller-manager-6bc9444ff5-5759d\" (UID: \"16331e67-7ae2-4c7d-9d74-8603551671cf\") " pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.570040 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-lcn2k\" (UniqueName: \"kubernetes.io/projected/16331e67-7ae2-4c7d-9d74-8603551671cf-kube-api-access-lcn2k\") pod \"controller-manager-6bc9444ff5-5759d\" (UID: \"16331e67-7ae2-4c7d-9d74-8603551671cf\") " pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.579519 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qcgz\" (UniqueName: \"kubernetes.io/projected/1c166827-bbdd-4c8a-bb22-82214666b20b-kube-api-access-6qcgz\") pod \"route-controller-manager-f4c9dbfbf-wcjfh\" (UID: \"1c166827-bbdd-4c8a-bb22-82214666b20b\") " pod="openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.620353 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.628837 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.722863 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.722929 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.853439 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.853859 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.853898 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.853980 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.854046 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.854071 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.854121 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.854178 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.854259 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.854487 4865 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.854503 4865 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.854512 4865 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.854522 4865 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.861847 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.930818 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.930879 4865 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="b8138f65307669cdc4ff02f4694ffd6820f91b6bb64b80c44de096bbc6f5fa2d" exitCode=137 Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.930922 4865 scope.go:117] "RemoveContainer" containerID="b8138f65307669cdc4ff02f4694ffd6820f91b6bb64b80c44de096bbc6f5fa2d" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.931025 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.959809 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bc9444ff5-5759d"] Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.963389 4865 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.980852 4865 scope.go:117] "RemoveContainer" containerID="b8138f65307669cdc4ff02f4694ffd6820f91b6bb64b80c44de096bbc6f5fa2d" Dec 05 05:57:25 crc kubenswrapper[4865]: E1205 05:57:25.981979 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8138f65307669cdc4ff02f4694ffd6820f91b6bb64b80c44de096bbc6f5fa2d\": container with ID starting with b8138f65307669cdc4ff02f4694ffd6820f91b6bb64b80c44de096bbc6f5fa2d not found: ID does not exist" containerID="b8138f65307669cdc4ff02f4694ffd6820f91b6bb64b80c44de096bbc6f5fa2d" Dec 05 05:57:25 crc kubenswrapper[4865]: I1205 05:57:25.982095 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8138f65307669cdc4ff02f4694ffd6820f91b6bb64b80c44de096bbc6f5fa2d"} err="failed to get container status \"b8138f65307669cdc4ff02f4694ffd6820f91b6bb64b80c44de096bbc6f5fa2d\": rpc error: code = NotFound desc = could not find container \"b8138f65307669cdc4ff02f4694ffd6820f91b6bb64b80c44de096bbc6f5fa2d\": container with ID starting with b8138f65307669cdc4ff02f4694ffd6820f91b6bb64b80c44de096bbc6f5fa2d not found: ID does not exist" Dec 05 05:57:26 crc kubenswrapper[4865]: I1205 05:57:26.005816 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh"] Dec 05 05:57:26 crc kubenswrapper[4865]: I1205 05:57:26.122926 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 05:57:26 crc kubenswrapper[4865]: I1205 05:57:26.749026 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 05:57:26 crc kubenswrapper[4865]: I1205 05:57:26.938619 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" event={"ID":"16331e67-7ae2-4c7d-9d74-8603551671cf","Type":"ContainerStarted","Data":"f7976a9fbdde913f1f054c111cdb6d16c497deab6019b790a38c1b405559bf38"} Dec 05 05:57:26 
crc kubenswrapper[4865]: I1205 05:57:26.938665 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" event={"ID":"16331e67-7ae2-4c7d-9d74-8603551671cf","Type":"ContainerStarted","Data":"ebd4e0a671d0d9f53a8a13fc524a96aca51ad76635487553da9356e7ad8d775f"} Dec 05 05:57:26 crc kubenswrapper[4865]: I1205 05:57:26.940094 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:57:26 crc kubenswrapper[4865]: I1205 05:57:26.942410 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh" event={"ID":"1c166827-bbdd-4c8a-bb22-82214666b20b","Type":"ContainerStarted","Data":"8b7d00c8bcef902f16c74a7cd6602096b0a57ef797859750067bd6d2b20e82a7"} Dec 05 05:57:26 crc kubenswrapper[4865]: I1205 05:57:26.942448 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh" event={"ID":"1c166827-bbdd-4c8a-bb22-82214666b20b","Type":"ContainerStarted","Data":"4732d583b66a464113b21ee4821af4b6f4b75ae1e8c95b3eeccb4266ff600383"} Dec 05 05:57:26 crc kubenswrapper[4865]: I1205 05:57:26.943197 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh" Dec 05 05:57:26 crc kubenswrapper[4865]: I1205 05:57:26.946842 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:57:26 crc kubenswrapper[4865]: I1205 05:57:26.955118 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh" Dec 05 05:57:26 crc kubenswrapper[4865]: I1205 05:57:26.969087 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" podStartSLOduration=3.96906272 podStartE2EDuration="3.96906272s" podCreationTimestamp="2025-12-05 05:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:57:26.964440489 +0000 UTC m=+266.244451721" watchObservedRunningTime="2025-12-05 05:57:26.96906272 +0000 UTC m=+266.249073942" Dec 05 05:57:27 crc kubenswrapper[4865]: I1205 05:57:27.031899 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f4c9dbfbf-wcjfh" podStartSLOduration=4.031871604 podStartE2EDuration="4.031871604s" podCreationTimestamp="2025-12-05 05:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:57:27.029813356 +0000 UTC m=+266.309824578" watchObservedRunningTime="2025-12-05 05:57:27.031871604 +0000 UTC m=+266.311882826" Dec 05 05:57:27 crc kubenswrapper[4865]: I1205 05:57:27.040588 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 05 05:57:29 crc kubenswrapper[4865]: I1205 05:57:29.336534 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 05:57:30 crc kubenswrapper[4865]: I1205 
05:57:30.247784 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 05 05:57:43 crc kubenswrapper[4865]: I1205 05:57:43.954236 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jvq84"] Dec 05 05:57:43 crc kubenswrapper[4865]: I1205 05:57:43.956191 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:43 crc kubenswrapper[4865]: I1205 05:57:43.976247 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jvq84"] Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.100178 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2cfdd03d-7078-4604-b34e-34a562d33fa3-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.100235 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cfdd03d-7078-4604-b34e-34a562d33fa3-trusted-ca\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.100283 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cfdd03d-7078-4604-b34e-34a562d33fa3-bound-sa-token\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.100300 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cfdd03d-7078-4604-b34e-34a562d33fa3-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.100494 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2cfdd03d-7078-4604-b34e-34a562d33fa3-registry-tls\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.100589 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfg56\" (UniqueName: \"kubernetes.io/projected/2cfdd03d-7078-4604-b34e-34a562d33fa3-kube-api-access-jfg56\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.100703 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/2cfdd03d-7078-4604-b34e-34a562d33fa3-registry-certificates\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.100756 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.120957 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.202770 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cfdd03d-7078-4604-b34e-34a562d33fa3-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.202872 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2cfdd03d-7078-4604-b34e-34a562d33fa3-registry-tls\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.202931 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfg56\" (UniqueName: \"kubernetes.io/projected/2cfdd03d-7078-4604-b34e-34a562d33fa3-kube-api-access-jfg56\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.202990 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2cfdd03d-7078-4604-b34e-34a562d33fa3-registry-certificates\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.203036 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2cfdd03d-7078-4604-b34e-34a562d33fa3-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.203071 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cfdd03d-7078-4604-b34e-34a562d33fa3-trusted-ca\") pod \"image-registry-66df7c8f76-jvq84\" (UID: 
\"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.203104 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cfdd03d-7078-4604-b34e-34a562d33fa3-bound-sa-token\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.203325 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cfdd03d-7078-4604-b34e-34a562d33fa3-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.204269 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2cfdd03d-7078-4604-b34e-34a562d33fa3-registry-certificates\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.204864 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cfdd03d-7078-4604-b34e-34a562d33fa3-trusted-ca\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.211289 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2cfdd03d-7078-4604-b34e-34a562d33fa3-registry-tls\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.211289 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2cfdd03d-7078-4604-b34e-34a562d33fa3-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.221502 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cfdd03d-7078-4604-b34e-34a562d33fa3-bound-sa-token\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.221878 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfg56\" (UniqueName: \"kubernetes.io/projected/2cfdd03d-7078-4604-b34e-34a562d33fa3-kube-api-access-jfg56\") pod \"image-registry-66df7c8f76-jvq84\" (UID: \"2cfdd03d-7078-4604-b34e-34a562d33fa3\") " pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.272634 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:44 crc kubenswrapper[4865]: I1205 05:57:44.765549 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jvq84"] Dec 05 05:57:44 crc kubenswrapper[4865]: W1205 05:57:44.768403 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cfdd03d_7078_4604_b34e_34a562d33fa3.slice/crio-c1f2d82e6ff9fc409e022569f592d6ca3d578c1916171f02def17f37b09cefe5 WatchSource:0}: Error finding container c1f2d82e6ff9fc409e022569f592d6ca3d578c1916171f02def17f37b09cefe5: Status 404 returned error can't find the container with id c1f2d82e6ff9fc409e022569f592d6ca3d578c1916171f02def17f37b09cefe5 Dec 05 05:57:45 crc kubenswrapper[4865]: I1205 05:57:45.048560 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" event={"ID":"2cfdd03d-7078-4604-b34e-34a562d33fa3","Type":"ContainerStarted","Data":"10beb933a908e6ec427fa1382180a28497e96fb22588940838a040ec4f7e7620"} Dec 05 05:57:45 crc kubenswrapper[4865]: I1205 05:57:45.048600 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" event={"ID":"2cfdd03d-7078-4604-b34e-34a562d33fa3","Type":"ContainerStarted","Data":"c1f2d82e6ff9fc409e022569f592d6ca3d578c1916171f02def17f37b09cefe5"} Dec 05 05:57:45 crc kubenswrapper[4865]: I1205 05:57:45.049628 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:57:45 crc kubenswrapper[4865]: I1205 05:57:45.072737 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" podStartSLOduration=2.072719549 podStartE2EDuration="2.072719549s" podCreationTimestamp="2025-12-05 05:57:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:57:45.063700865 +0000 UTC m=+284.343712097" watchObservedRunningTime="2025-12-05 05:57:45.072719549 +0000 UTC m=+284.352730771" Dec 05 05:57:50 crc kubenswrapper[4865]: I1205 05:57:50.630345 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ccqrr"] Dec 05 05:57:50 crc kubenswrapper[4865]: I1205 05:57:50.630954 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ccqrr" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" containerName="registry-server" containerID="cri-o://c672a0092c43b18fe726b5d154341dc3fd84c70d15a16eac437c9c01f7bce4ab" gracePeriod=2 Dec 05 05:57:50 crc kubenswrapper[4865]: I1205 05:57:50.829370 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5zhc2"] Dec 05 05:57:50 crc kubenswrapper[4865]: I1205 05:57:50.829991 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5zhc2" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" containerName="registry-server" containerID="cri-o://4b86da1bf649e6e44e0757a6877f1535c13c51f539c162e0f607433cec81cd48" gracePeriod=2 Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.090014 4865 generic.go:334] "Generic (PLEG): container finished" podID="582e42c0-b2d0-4b24-900e-1316a155c471" 
containerID="c672a0092c43b18fe726b5d154341dc3fd84c70d15a16eac437c9c01f7bce4ab" exitCode=0 Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.090101 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccqrr" event={"ID":"582e42c0-b2d0-4b24-900e-1316a155c471","Type":"ContainerDied","Data":"c672a0092c43b18fe726b5d154341dc3fd84c70d15a16eac437c9c01f7bce4ab"} Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.092987 4865 generic.go:334] "Generic (PLEG): container finished" podID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" containerID="4b86da1bf649e6e44e0757a6877f1535c13c51f539c162e0f607433cec81cd48" exitCode=0 Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.093021 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zhc2" event={"ID":"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e","Type":"ContainerDied","Data":"4b86da1bf649e6e44e0757a6877f1535c13c51f539c162e0f607433cec81cd48"} Dec 05 05:57:52 crc kubenswrapper[4865]: E1205 05:57:52.232810 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c672a0092c43b18fe726b5d154341dc3fd84c70d15a16eac437c9c01f7bce4ab is running failed: container process not found" containerID="c672a0092c43b18fe726b5d154341dc3fd84c70d15a16eac437c9c01f7bce4ab" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 05:57:52 crc kubenswrapper[4865]: E1205 05:57:52.233570 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c672a0092c43b18fe726b5d154341dc3fd84c70d15a16eac437c9c01f7bce4ab is running failed: container process not found" containerID="c672a0092c43b18fe726b5d154341dc3fd84c70d15a16eac437c9c01f7bce4ab" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 05:57:52 crc kubenswrapper[4865]: E1205 05:57:52.233978 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c672a0092c43b18fe726b5d154341dc3fd84c70d15a16eac437c9c01f7bce4ab is running failed: container process not found" containerID="c672a0092c43b18fe726b5d154341dc3fd84c70d15a16eac437c9c01f7bce4ab" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 05:57:52 crc kubenswrapper[4865]: E1205 05:57:52.234024 4865 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c672a0092c43b18fe726b5d154341dc3fd84c70d15a16eac437c9c01f7bce4ab is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-ccqrr" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" containerName="registry-server" Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.333028 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ccqrr" Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.445879 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psql8\" (UniqueName: \"kubernetes.io/projected/582e42c0-b2d0-4b24-900e-1316a155c471-kube-api-access-psql8\") pod \"582e42c0-b2d0-4b24-900e-1316a155c471\" (UID: \"582e42c0-b2d0-4b24-900e-1316a155c471\") " Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.446026 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/582e42c0-b2d0-4b24-900e-1316a155c471-catalog-content\") pod \"582e42c0-b2d0-4b24-900e-1316a155c471\" (UID: \"582e42c0-b2d0-4b24-900e-1316a155c471\") " Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.446062 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/582e42c0-b2d0-4b24-900e-1316a155c471-utilities\") pod \"582e42c0-b2d0-4b24-900e-1316a155c471\" (UID: \"582e42c0-b2d0-4b24-900e-1316a155c471\") " Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.447016 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/582e42c0-b2d0-4b24-900e-1316a155c471-utilities" (OuterVolumeSpecName: "utilities") pod "582e42c0-b2d0-4b24-900e-1316a155c471" (UID: "582e42c0-b2d0-4b24-900e-1316a155c471"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.455036 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/582e42c0-b2d0-4b24-900e-1316a155c471-kube-api-access-psql8" (OuterVolumeSpecName: "kube-api-access-psql8") pod "582e42c0-b2d0-4b24-900e-1316a155c471" (UID: "582e42c0-b2d0-4b24-900e-1316a155c471"). InnerVolumeSpecName "kube-api-access-psql8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.496923 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/582e42c0-b2d0-4b24-900e-1316a155c471-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "582e42c0-b2d0-4b24-900e-1316a155c471" (UID: "582e42c0-b2d0-4b24-900e-1316a155c471"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:57:52 crc kubenswrapper[4865]: E1205 05:57:52.514260 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b86da1bf649e6e44e0757a6877f1535c13c51f539c162e0f607433cec81cd48 is running failed: container process not found" containerID="4b86da1bf649e6e44e0757a6877f1535c13c51f539c162e0f607433cec81cd48" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 05:57:52 crc kubenswrapper[4865]: E1205 05:57:52.514922 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b86da1bf649e6e44e0757a6877f1535c13c51f539c162e0f607433cec81cd48 is running failed: container process not found" containerID="4b86da1bf649e6e44e0757a6877f1535c13c51f539c162e0f607433cec81cd48" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 05:57:52 crc kubenswrapper[4865]: E1205 05:57:52.515362 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b86da1bf649e6e44e0757a6877f1535c13c51f539c162e0f607433cec81cd48 is running failed: container process not found" containerID="4b86da1bf649e6e44e0757a6877f1535c13c51f539c162e0f607433cec81cd48" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 05:57:52 crc kubenswrapper[4865]: E1205 05:57:52.515450 4865 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b86da1bf649e6e44e0757a6877f1535c13c51f539c162e0f607433cec81cd48 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-5zhc2" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" containerName="registry-server" Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.548101 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/582e42c0-b2d0-4b24-900e-1316a155c471-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.548134 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/582e42c0-b2d0-4b24-900e-1316a155c471-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.548145 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psql8\" (UniqueName: \"kubernetes.io/projected/582e42c0-b2d0-4b24-900e-1316a155c471-kube-api-access-psql8\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.814773 4865 util.go:48] "No ready sandbox for pod can be found. 
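The ExecSync failures above come from the readiness probe of the registry-server containers, which execs grpc_health_probe -addr=:50051 inside the container; once the container process is gone, CRI-O can only answer NotFound. A minimal Go sketch of the equivalent gRPC health check issued directly, where the address, plaintext transport, and empty service name are assumptions:

package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	healthpb "google.golang.org/grpc/health/grpc_health_v1"
)

func main() {
	// Same endpoint the exec probe targets inside the pod; plaintext is assumed here.
	conn, err := grpc.Dial("localhost:50051",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatalf("dial: %v", err)
	}
	defer conn.Close()

	ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
	defer cancel()

	// Empty Service name asks for the overall serving status of the server.
	resp, err := healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{})
	if err != nil {
		log.Fatalf("health check: %v", err)
	}
	fmt.Println("status:", resp.GetStatus()) // SERVING when the registry-server is ready
}

grpc_health_probe exits non-zero for anything other than SERVING, which is what the kubelet turns into the Readiness probe result seen in the prober.go entries.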
Need to start a new one" pod="openshift-marketplace/certified-operators-5zhc2" Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.953446 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a13b3fdc-c602-48f5-bc10-e2e30df8cc0e-utilities\") pod \"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e\" (UID: \"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e\") " Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.953534 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42qmf\" (UniqueName: \"kubernetes.io/projected/a13b3fdc-c602-48f5-bc10-e2e30df8cc0e-kube-api-access-42qmf\") pod \"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e\" (UID: \"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e\") " Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.953574 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a13b3fdc-c602-48f5-bc10-e2e30df8cc0e-catalog-content\") pod \"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e\" (UID: \"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e\") " Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.954177 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a13b3fdc-c602-48f5-bc10-e2e30df8cc0e-utilities" (OuterVolumeSpecName: "utilities") pod "a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" (UID: "a13b3fdc-c602-48f5-bc10-e2e30df8cc0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:57:52 crc kubenswrapper[4865]: I1205 05:57:52.956860 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a13b3fdc-c602-48f5-bc10-e2e30df8cc0e-kube-api-access-42qmf" (OuterVolumeSpecName: "kube-api-access-42qmf") pod "a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" (UID: "a13b3fdc-c602-48f5-bc10-e2e30df8cc0e"). InnerVolumeSpecName "kube-api-access-42qmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.004857 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a13b3fdc-c602-48f5-bc10-e2e30df8cc0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" (UID: "a13b3fdc-c602-48f5-bc10-e2e30df8cc0e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.029995 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mds9l"] Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.030245 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mds9l" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" containerName="registry-server" containerID="cri-o://56a9f4d3f3183605a2d0a0891af2adc8d377769b40ea5cd4f0145972aaaf632b" gracePeriod=2 Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.055357 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a13b3fdc-c602-48f5-bc10-e2e30df8cc0e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.055682 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42qmf\" (UniqueName: \"kubernetes.io/projected/a13b3fdc-c602-48f5-bc10-e2e30df8cc0e-kube-api-access-42qmf\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.055697 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a13b3fdc-c602-48f5-bc10-e2e30df8cc0e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.101342 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zhc2" event={"ID":"a13b3fdc-c602-48f5-bc10-e2e30df8cc0e","Type":"ContainerDied","Data":"32595a5c9c95753de79666b4c4d850efdb66451a5c28f95170105b4d0b53b388"} Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.101405 4865 scope.go:117] "RemoveContainer" containerID="4b86da1bf649e6e44e0757a6877f1535c13c51f539c162e0f607433cec81cd48" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.101535 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5zhc2" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.105114 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccqrr" event={"ID":"582e42c0-b2d0-4b24-900e-1316a155c471","Type":"ContainerDied","Data":"34eac7efd1c866225c8af0dd70f2041b8e7475bb73714da5dac8f3a1fcacf567"} Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.105141 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ccqrr" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.119035 4865 scope.go:117] "RemoveContainer" containerID="6efad480a2917cf54bb33cc53a81fca29bfc02b19668c477c0f95e27251a0be9" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.131629 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ccqrr"] Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.135101 4865 scope.go:117] "RemoveContainer" containerID="d458832c57ba86e3d002b96d902ecceba3b53620221417213af4d9b88d7bd7bb" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.143783 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ccqrr"] Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.150046 4865 scope.go:117] "RemoveContainer" containerID="c672a0092c43b18fe726b5d154341dc3fd84c70d15a16eac437c9c01f7bce4ab" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.163394 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5zhc2"] Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.166736 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5zhc2"] Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.169355 4865 scope.go:117] "RemoveContainer" containerID="1eca415c9add728d8c06d6db223e2020eb15cc5dce769a80666b4389c04354ec" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.191220 4865 scope.go:117] "RemoveContainer" containerID="25bd2317d0ad9c82e708aead223c59477c04dca7149e90b3e4a14f7287917644" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.228622 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rw9pr"] Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.228860 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rw9pr" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" containerName="registry-server" containerID="cri-o://b1a62f068f425148ad02449b42b2dd525b10167f7e56712506bd2861dc6c1179" gracePeriod=2 Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.458149 4865 util.go:48] "No ready sandbox for pod can be found. 
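Each "Killing container with a grace period" above follows a SyncLoop DELETE that arrived from the API (source="api"); the kubelet hands the grace period to CRI-O as the stop timeout before the container is force-killed. A minimal client-go sketch of an equivalent API-side deletion, assuming in-cluster credentials; the 2-second grace period mirrors gracePeriod=2 in the log, and nothing here implies this is how the catalog pods were actually deleted in this cluster:

package main

import (
	"context"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		log.Fatal(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	grace := int64(2) // matches gracePeriod=2 in the kuberuntime_container.go entries
	err = client.CoreV1().Pods("openshift-marketplace").Delete(context.TODO(),
		"community-operators-ccqrr", metav1.DeleteOptions{GracePeriodSeconds: &grace})
	if err != nil {
		log.Fatal(err)
	}
}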
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mds9l" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.566602 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbe8803a-815d-4318-bfaa-1949755ed910-utilities\") pod \"bbe8803a-815d-4318-bfaa-1949755ed910\" (UID: \"bbe8803a-815d-4318-bfaa-1949755ed910\") " Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.566696 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6294\" (UniqueName: \"kubernetes.io/projected/bbe8803a-815d-4318-bfaa-1949755ed910-kube-api-access-b6294\") pod \"bbe8803a-815d-4318-bfaa-1949755ed910\" (UID: \"bbe8803a-815d-4318-bfaa-1949755ed910\") " Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.566967 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbe8803a-815d-4318-bfaa-1949755ed910-catalog-content\") pod \"bbe8803a-815d-4318-bfaa-1949755ed910\" (UID: \"bbe8803a-815d-4318-bfaa-1949755ed910\") " Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.567838 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbe8803a-815d-4318-bfaa-1949755ed910-utilities" (OuterVolumeSpecName: "utilities") pod "bbe8803a-815d-4318-bfaa-1949755ed910" (UID: "bbe8803a-815d-4318-bfaa-1949755ed910"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.568291 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbe8803a-815d-4318-bfaa-1949755ed910-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.597872 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbe8803a-815d-4318-bfaa-1949755ed910-kube-api-access-b6294" (OuterVolumeSpecName: "kube-api-access-b6294") pod "bbe8803a-815d-4318-bfaa-1949755ed910" (UID: "bbe8803a-815d-4318-bfaa-1949755ed910"). InnerVolumeSpecName "kube-api-access-b6294". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.620152 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbe8803a-815d-4318-bfaa-1949755ed910-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bbe8803a-815d-4318-bfaa-1949755ed910" (UID: "bbe8803a-815d-4318-bfaa-1949755ed910"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.649956 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rw9pr" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.670021 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6294\" (UniqueName: \"kubernetes.io/projected/bbe8803a-815d-4318-bfaa-1949755ed910-kube-api-access-b6294\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.670056 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbe8803a-815d-4318-bfaa-1949755ed910-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.770533 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/748082c2-70ae-4b67-9c21-ff6f32030822-utilities\") pod \"748082c2-70ae-4b67-9c21-ff6f32030822\" (UID: \"748082c2-70ae-4b67-9c21-ff6f32030822\") " Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.770591 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j6c8\" (UniqueName: \"kubernetes.io/projected/748082c2-70ae-4b67-9c21-ff6f32030822-kube-api-access-4j6c8\") pod \"748082c2-70ae-4b67-9c21-ff6f32030822\" (UID: \"748082c2-70ae-4b67-9c21-ff6f32030822\") " Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.770679 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/748082c2-70ae-4b67-9c21-ff6f32030822-catalog-content\") pod \"748082c2-70ae-4b67-9c21-ff6f32030822\" (UID: \"748082c2-70ae-4b67-9c21-ff6f32030822\") " Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.772500 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/748082c2-70ae-4b67-9c21-ff6f32030822-utilities" (OuterVolumeSpecName: "utilities") pod "748082c2-70ae-4b67-9c21-ff6f32030822" (UID: "748082c2-70ae-4b67-9c21-ff6f32030822"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.774533 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/748082c2-70ae-4b67-9c21-ff6f32030822-kube-api-access-4j6c8" (OuterVolumeSpecName: "kube-api-access-4j6c8") pod "748082c2-70ae-4b67-9c21-ff6f32030822" (UID: "748082c2-70ae-4b67-9c21-ff6f32030822"). InnerVolumeSpecName "kube-api-access-4j6c8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.871975 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j6c8\" (UniqueName: \"kubernetes.io/projected/748082c2-70ae-4b67-9c21-ff6f32030822-kube-api-access-4j6c8\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.872012 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/748082c2-70ae-4b67-9c21-ff6f32030822-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.890503 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/748082c2-70ae-4b67-9c21-ff6f32030822-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "748082c2-70ae-4b67-9c21-ff6f32030822" (UID: "748082c2-70ae-4b67-9c21-ff6f32030822"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:57:53 crc kubenswrapper[4865]: I1205 05:57:53.973177 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/748082c2-70ae-4b67-9c21-ff6f32030822-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.114346 4865 generic.go:334] "Generic (PLEG): container finished" podID="bbe8803a-815d-4318-bfaa-1949755ed910" containerID="56a9f4d3f3183605a2d0a0891af2adc8d377769b40ea5cd4f0145972aaaf632b" exitCode=0 Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.114431 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mds9l" event={"ID":"bbe8803a-815d-4318-bfaa-1949755ed910","Type":"ContainerDied","Data":"56a9f4d3f3183605a2d0a0891af2adc8d377769b40ea5cd4f0145972aaaf632b"} Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.114500 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mds9l" event={"ID":"bbe8803a-815d-4318-bfaa-1949755ed910","Type":"ContainerDied","Data":"44e296713a12c2a46acb991a34d0d5b61ce59a03fccb07b03c4e6c65a4c53454"} Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.114521 4865 scope.go:117] "RemoveContainer" containerID="56a9f4d3f3183605a2d0a0891af2adc8d377769b40ea5cd4f0145972aaaf632b" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.114451 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mds9l" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.119805 4865 generic.go:334] "Generic (PLEG): container finished" podID="748082c2-70ae-4b67-9c21-ff6f32030822" containerID="b1a62f068f425148ad02449b42b2dd525b10167f7e56712506bd2861dc6c1179" exitCode=0 Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.119867 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw9pr" event={"ID":"748082c2-70ae-4b67-9c21-ff6f32030822","Type":"ContainerDied","Data":"b1a62f068f425148ad02449b42b2dd525b10167f7e56712506bd2861dc6c1179"} Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.119888 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw9pr" event={"ID":"748082c2-70ae-4b67-9c21-ff6f32030822","Type":"ContainerDied","Data":"0a513d1494eb354e5c3747503c4cc292fa94e445c4f54c329cd0890da2b8e59a"} Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.119964 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rw9pr" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.134736 4865 scope.go:117] "RemoveContainer" containerID="9c9577a8383f9cb872fc992fed8dedd51dc286a2bbdf84e67edc8f1ee2ab5e27" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.158739 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mds9l"] Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.163958 4865 scope.go:117] "RemoveContainer" containerID="971d99082909a62fe82a7d323eef29d816b804c318fe36a3817fcdb675ba4933" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.172273 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mds9l"] Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.178270 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rw9pr"] Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.182900 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rw9pr"] Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.191276 4865 scope.go:117] "RemoveContainer" containerID="56a9f4d3f3183605a2d0a0891af2adc8d377769b40ea5cd4f0145972aaaf632b" Dec 05 05:57:54 crc kubenswrapper[4865]: E1205 05:57:54.192716 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56a9f4d3f3183605a2d0a0891af2adc8d377769b40ea5cd4f0145972aaaf632b\": container with ID starting with 56a9f4d3f3183605a2d0a0891af2adc8d377769b40ea5cd4f0145972aaaf632b not found: ID does not exist" containerID="56a9f4d3f3183605a2d0a0891af2adc8d377769b40ea5cd4f0145972aaaf632b" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.192746 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56a9f4d3f3183605a2d0a0891af2adc8d377769b40ea5cd4f0145972aaaf632b"} err="failed to get container status \"56a9f4d3f3183605a2d0a0891af2adc8d377769b40ea5cd4f0145972aaaf632b\": rpc error: code = NotFound desc = could not find container \"56a9f4d3f3183605a2d0a0891af2adc8d377769b40ea5cd4f0145972aaaf632b\": container with ID starting with 56a9f4d3f3183605a2d0a0891af2adc8d377769b40ea5cd4f0145972aaaf632b not found: ID does not exist" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.192769 4865 scope.go:117] "RemoveContainer" containerID="9c9577a8383f9cb872fc992fed8dedd51dc286a2bbdf84e67edc8f1ee2ab5e27" Dec 05 05:57:54 crc kubenswrapper[4865]: E1205 05:57:54.193167 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c9577a8383f9cb872fc992fed8dedd51dc286a2bbdf84e67edc8f1ee2ab5e27\": container with ID starting with 9c9577a8383f9cb872fc992fed8dedd51dc286a2bbdf84e67edc8f1ee2ab5e27 not found: ID does not exist" containerID="9c9577a8383f9cb872fc992fed8dedd51dc286a2bbdf84e67edc8f1ee2ab5e27" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.193211 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c9577a8383f9cb872fc992fed8dedd51dc286a2bbdf84e67edc8f1ee2ab5e27"} err="failed to get container status \"9c9577a8383f9cb872fc992fed8dedd51dc286a2bbdf84e67edc8f1ee2ab5e27\": rpc error: code = NotFound desc = could not find container \"9c9577a8383f9cb872fc992fed8dedd51dc286a2bbdf84e67edc8f1ee2ab5e27\": container with ID starting with 
9c9577a8383f9cb872fc992fed8dedd51dc286a2bbdf84e67edc8f1ee2ab5e27 not found: ID does not exist" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.193242 4865 scope.go:117] "RemoveContainer" containerID="971d99082909a62fe82a7d323eef29d816b804c318fe36a3817fcdb675ba4933" Dec 05 05:57:54 crc kubenswrapper[4865]: E1205 05:57:54.193649 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"971d99082909a62fe82a7d323eef29d816b804c318fe36a3817fcdb675ba4933\": container with ID starting with 971d99082909a62fe82a7d323eef29d816b804c318fe36a3817fcdb675ba4933 not found: ID does not exist" containerID="971d99082909a62fe82a7d323eef29d816b804c318fe36a3817fcdb675ba4933" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.193676 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"971d99082909a62fe82a7d323eef29d816b804c318fe36a3817fcdb675ba4933"} err="failed to get container status \"971d99082909a62fe82a7d323eef29d816b804c318fe36a3817fcdb675ba4933\": rpc error: code = NotFound desc = could not find container \"971d99082909a62fe82a7d323eef29d816b804c318fe36a3817fcdb675ba4933\": container with ID starting with 971d99082909a62fe82a7d323eef29d816b804c318fe36a3817fcdb675ba4933 not found: ID does not exist" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.193692 4865 scope.go:117] "RemoveContainer" containerID="b1a62f068f425148ad02449b42b2dd525b10167f7e56712506bd2861dc6c1179" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.221043 4865 scope.go:117] "RemoveContainer" containerID="c73a936c6387a1dbf28742e3d992fc433dc08a0b072cde9772f3b2c1edef410c" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.242858 4865 scope.go:117] "RemoveContainer" containerID="595bebb32aacb73c9cd49cb839653596feca624c51476c863f2aea315441e055" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.263143 4865 scope.go:117] "RemoveContainer" containerID="b1a62f068f425148ad02449b42b2dd525b10167f7e56712506bd2861dc6c1179" Dec 05 05:57:54 crc kubenswrapper[4865]: E1205 05:57:54.263733 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1a62f068f425148ad02449b42b2dd525b10167f7e56712506bd2861dc6c1179\": container with ID starting with b1a62f068f425148ad02449b42b2dd525b10167f7e56712506bd2861dc6c1179 not found: ID does not exist" containerID="b1a62f068f425148ad02449b42b2dd525b10167f7e56712506bd2861dc6c1179" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.263789 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1a62f068f425148ad02449b42b2dd525b10167f7e56712506bd2861dc6c1179"} err="failed to get container status \"b1a62f068f425148ad02449b42b2dd525b10167f7e56712506bd2861dc6c1179\": rpc error: code = NotFound desc = could not find container \"b1a62f068f425148ad02449b42b2dd525b10167f7e56712506bd2861dc6c1179\": container with ID starting with b1a62f068f425148ad02449b42b2dd525b10167f7e56712506bd2861dc6c1179 not found: ID does not exist" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.263902 4865 scope.go:117] "RemoveContainer" containerID="c73a936c6387a1dbf28742e3d992fc433dc08a0b072cde9772f3b2c1edef410c" Dec 05 05:57:54 crc kubenswrapper[4865]: E1205 05:57:54.264292 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c73a936c6387a1dbf28742e3d992fc433dc08a0b072cde9772f3b2c1edef410c\": container 
with ID starting with c73a936c6387a1dbf28742e3d992fc433dc08a0b072cde9772f3b2c1edef410c not found: ID does not exist" containerID="c73a936c6387a1dbf28742e3d992fc433dc08a0b072cde9772f3b2c1edef410c" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.264318 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73a936c6387a1dbf28742e3d992fc433dc08a0b072cde9772f3b2c1edef410c"} err="failed to get container status \"c73a936c6387a1dbf28742e3d992fc433dc08a0b072cde9772f3b2c1edef410c\": rpc error: code = NotFound desc = could not find container \"c73a936c6387a1dbf28742e3d992fc433dc08a0b072cde9772f3b2c1edef410c\": container with ID starting with c73a936c6387a1dbf28742e3d992fc433dc08a0b072cde9772f3b2c1edef410c not found: ID does not exist" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.264333 4865 scope.go:117] "RemoveContainer" containerID="595bebb32aacb73c9cd49cb839653596feca624c51476c863f2aea315441e055" Dec 05 05:57:54 crc kubenswrapper[4865]: E1205 05:57:54.264537 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"595bebb32aacb73c9cd49cb839653596feca624c51476c863f2aea315441e055\": container with ID starting with 595bebb32aacb73c9cd49cb839653596feca624c51476c863f2aea315441e055 not found: ID does not exist" containerID="595bebb32aacb73c9cd49cb839653596feca624c51476c863f2aea315441e055" Dec 05 05:57:54 crc kubenswrapper[4865]: I1205 05:57:54.264557 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"595bebb32aacb73c9cd49cb839653596feca624c51476c863f2aea315441e055"} err="failed to get container status \"595bebb32aacb73c9cd49cb839653596feca624c51476c863f2aea315441e055\": rpc error: code = NotFound desc = could not find container \"595bebb32aacb73c9cd49cb839653596feca624c51476c863f2aea315441e055\": container with ID starting with 595bebb32aacb73c9cd49cb839653596feca624c51476c863f2aea315441e055 not found: ID does not exist" Dec 05 05:57:55 crc kubenswrapper[4865]: I1205 05:57:55.014467 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" path="/var/lib/kubelet/pods/582e42c0-b2d0-4b24-900e-1316a155c471/volumes" Dec 05 05:57:55 crc kubenswrapper[4865]: I1205 05:57:55.015636 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" path="/var/lib/kubelet/pods/748082c2-70ae-4b67-9c21-ff6f32030822/volumes" Dec 05 05:57:55 crc kubenswrapper[4865]: I1205 05:57:55.016447 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" path="/var/lib/kubelet/pods/a13b3fdc-c602-48f5-bc10-e2e30df8cc0e/volumes" Dec 05 05:57:55 crc kubenswrapper[4865]: I1205 05:57:55.017724 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" path="/var/lib/kubelet/pods/bbe8803a-815d-4318-bfaa-1949755ed910/volumes" Dec 05 05:58:00 crc kubenswrapper[4865]: I1205 05:58:00.840994 4865 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 05 05:58:04 crc kubenswrapper[4865]: I1205 05:58:04.283251 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-jvq84" Dec 05 05:58:04 crc kubenswrapper[4865]: I1205 05:58:04.369708 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-6fth8"] Dec 05 05:58:29 crc kubenswrapper[4865]: I1205 05:58:29.424604 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" podUID="5dbbc361-a522-4548-a14e-bdd061c7bc4b" containerName="registry" containerID="cri-o://4e3c74d39fc21ac0c8c699f3a5e5b8fb87858e8ffed6674964b1ccdf9035438e" gracePeriod=30 Dec 05 05:58:29 crc kubenswrapper[4865]: I1205 05:58:29.808394 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:58:29 crc kubenswrapper[4865]: I1205 05:58:29.912354 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5dbbc361-a522-4548-a14e-bdd061c7bc4b-registry-tls\") pod \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " Dec 05 05:58:29 crc kubenswrapper[4865]: I1205 05:58:29.912398 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5dbbc361-a522-4548-a14e-bdd061c7bc4b-trusted-ca\") pod \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " Dec 05 05:58:29 crc kubenswrapper[4865]: I1205 05:58:29.912455 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqw68\" (UniqueName: \"kubernetes.io/projected/5dbbc361-a522-4548-a14e-bdd061c7bc4b-kube-api-access-qqw68\") pod \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " Dec 05 05:58:29 crc kubenswrapper[4865]: I1205 05:58:29.912492 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5dbbc361-a522-4548-a14e-bdd061c7bc4b-ca-trust-extracted\") pod \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " Dec 05 05:58:29 crc kubenswrapper[4865]: I1205 05:58:29.912515 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5dbbc361-a522-4548-a14e-bdd061c7bc4b-bound-sa-token\") pod \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " Dec 05 05:58:29 crc kubenswrapper[4865]: I1205 05:58:29.912544 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5dbbc361-a522-4548-a14e-bdd061c7bc4b-registry-certificates\") pod \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " Dec 05 05:58:29 crc kubenswrapper[4865]: I1205 05:58:29.912691 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " Dec 05 05:58:29 crc kubenswrapper[4865]: I1205 05:58:29.912731 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5dbbc361-a522-4548-a14e-bdd061c7bc4b-installation-pull-secrets\") pod \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\" (UID: \"5dbbc361-a522-4548-a14e-bdd061c7bc4b\") " Dec 05 
05:58:29 crc kubenswrapper[4865]: I1205 05:58:29.913559 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dbbc361-a522-4548-a14e-bdd061c7bc4b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5dbbc361-a522-4548-a14e-bdd061c7bc4b" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:58:29 crc kubenswrapper[4865]: I1205 05:58:29.913698 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dbbc361-a522-4548-a14e-bdd061c7bc4b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5dbbc361-a522-4548-a14e-bdd061c7bc4b" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:58:29 crc kubenswrapper[4865]: I1205 05:58:29.919725 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dbbc361-a522-4548-a14e-bdd061c7bc4b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5dbbc361-a522-4548-a14e-bdd061c7bc4b" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:58:29 crc kubenswrapper[4865]: I1205 05:58:29.922174 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dbbc361-a522-4548-a14e-bdd061c7bc4b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5dbbc361-a522-4548-a14e-bdd061c7bc4b" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:58:29 crc kubenswrapper[4865]: I1205 05:58:29.927874 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dbbc361-a522-4548-a14e-bdd061c7bc4b-kube-api-access-qqw68" (OuterVolumeSpecName: "kube-api-access-qqw68") pod "5dbbc361-a522-4548-a14e-bdd061c7bc4b" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b"). InnerVolumeSpecName "kube-api-access-qqw68". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:58:29 crc kubenswrapper[4865]: I1205 05:58:29.928327 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dbbc361-a522-4548-a14e-bdd061c7bc4b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5dbbc361-a522-4548-a14e-bdd061c7bc4b" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:58:29 crc kubenswrapper[4865]: I1205 05:58:29.928564 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dbbc361-a522-4548-a14e-bdd061c7bc4b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5dbbc361-a522-4548-a14e-bdd061c7bc4b" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:58:29 crc kubenswrapper[4865]: I1205 05:58:29.940414 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "5dbbc361-a522-4548-a14e-bdd061c7bc4b" (UID: "5dbbc361-a522-4548-a14e-bdd061c7bc4b"). 
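The TearDown sequence above, together with the "Volume detached" records and the earlier "Cleaned up orphaned pod volumes dir" entries, is the kubelet emptying /var/lib/kubelet/pods/<uid>/volumes before the pod directory can be removed. A small sketch that reports any pod UID directories still holding volume plugin subdirectories, which can help verify that cleanup like this completed; the kubelet root path matches the paths in the log:

package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	root := "/var/lib/kubelet/pods" // same root as the "Cleaned up orphaned pod volumes dir" entries
	pods, err := os.ReadDir(root)
	if err != nil {
		log.Fatal(err)
	}
	for _, p := range pods {
		if !p.IsDir() {
			continue
		}
		volDir := filepath.Join(root, p.Name(), "volumes")
		plugins, err := os.ReadDir(volDir)
		if err != nil || len(plugins) == 0 {
			continue // no volumes left under this pod UID
		}
		fmt.Printf("pod %s still has %d volume plugin dir(s) under %s\n", p.Name(), len(plugins), volDir)
	}
}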
InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 05:58:30 crc kubenswrapper[4865]: I1205 05:58:30.014501 4865 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5dbbc361-a522-4548-a14e-bdd061c7bc4b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:30 crc kubenswrapper[4865]: I1205 05:58:30.014541 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqw68\" (UniqueName: \"kubernetes.io/projected/5dbbc361-a522-4548-a14e-bdd061c7bc4b-kube-api-access-qqw68\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:30 crc kubenswrapper[4865]: I1205 05:58:30.014552 4865 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5dbbc361-a522-4548-a14e-bdd061c7bc4b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:30 crc kubenswrapper[4865]: I1205 05:58:30.014578 4865 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5dbbc361-a522-4548-a14e-bdd061c7bc4b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:30 crc kubenswrapper[4865]: I1205 05:58:30.014588 4865 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5dbbc361-a522-4548-a14e-bdd061c7bc4b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:30 crc kubenswrapper[4865]: I1205 05:58:30.014598 4865 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5dbbc361-a522-4548-a14e-bdd061c7bc4b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:30 crc kubenswrapper[4865]: I1205 05:58:30.014607 4865 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5dbbc361-a522-4548-a14e-bdd061c7bc4b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:30 crc kubenswrapper[4865]: I1205 05:58:30.313732 4865 generic.go:334] "Generic (PLEG): container finished" podID="5dbbc361-a522-4548-a14e-bdd061c7bc4b" containerID="4e3c74d39fc21ac0c8c699f3a5e5b8fb87858e8ffed6674964b1ccdf9035438e" exitCode=0 Dec 05 05:58:30 crc kubenswrapper[4865]: I1205 05:58:30.313783 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" event={"ID":"5dbbc361-a522-4548-a14e-bdd061c7bc4b","Type":"ContainerDied","Data":"4e3c74d39fc21ac0c8c699f3a5e5b8fb87858e8ffed6674964b1ccdf9035438e"} Dec 05 05:58:30 crc kubenswrapper[4865]: I1205 05:58:30.313799 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" Dec 05 05:58:30 crc kubenswrapper[4865]: I1205 05:58:30.313813 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6fth8" event={"ID":"5dbbc361-a522-4548-a14e-bdd061c7bc4b","Type":"ContainerDied","Data":"af6bc3c359d7b5decbce624b71a763a5b4f6d7bc2a0c68d61a15813fcd1ded51"} Dec 05 05:58:30 crc kubenswrapper[4865]: I1205 05:58:30.313859 4865 scope.go:117] "RemoveContainer" containerID="4e3c74d39fc21ac0c8c699f3a5e5b8fb87858e8ffed6674964b1ccdf9035438e" Dec 05 05:58:30 crc kubenswrapper[4865]: I1205 05:58:30.341354 4865 scope.go:117] "RemoveContainer" containerID="4e3c74d39fc21ac0c8c699f3a5e5b8fb87858e8ffed6674964b1ccdf9035438e" Dec 05 05:58:30 crc kubenswrapper[4865]: E1205 05:58:30.342402 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e3c74d39fc21ac0c8c699f3a5e5b8fb87858e8ffed6674964b1ccdf9035438e\": container with ID starting with 4e3c74d39fc21ac0c8c699f3a5e5b8fb87858e8ffed6674964b1ccdf9035438e not found: ID does not exist" containerID="4e3c74d39fc21ac0c8c699f3a5e5b8fb87858e8ffed6674964b1ccdf9035438e" Dec 05 05:58:30 crc kubenswrapper[4865]: I1205 05:58:30.342456 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3c74d39fc21ac0c8c699f3a5e5b8fb87858e8ffed6674964b1ccdf9035438e"} err="failed to get container status \"4e3c74d39fc21ac0c8c699f3a5e5b8fb87858e8ffed6674964b1ccdf9035438e\": rpc error: code = NotFound desc = could not find container \"4e3c74d39fc21ac0c8c699f3a5e5b8fb87858e8ffed6674964b1ccdf9035438e\": container with ID starting with 4e3c74d39fc21ac0c8c699f3a5e5b8fb87858e8ffed6674964b1ccdf9035438e not found: ID does not exist" Dec 05 05:58:30 crc kubenswrapper[4865]: I1205 05:58:30.358708 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6fth8"] Dec 05 05:58:30 crc kubenswrapper[4865]: I1205 05:58:30.363058 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6fth8"] Dec 05 05:58:31 crc kubenswrapper[4865]: I1205 05:58:31.016108 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dbbc361-a522-4548-a14e-bdd061c7bc4b" path="/var/lib/kubelet/pods/5dbbc361-a522-4548-a14e-bdd061c7bc4b/volumes" Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.117477 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bc9444ff5-5759d"] Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.118008 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" podUID="16331e67-7ae2-4c7d-9d74-8603551671cf" containerName="controller-manager" containerID="cri-o://f7976a9fbdde913f1f054c111cdb6d16c497deab6019b790a38c1b405559bf38" gracePeriod=30 Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.360711 4865 generic.go:334] "Generic (PLEG): container finished" podID="16331e67-7ae2-4c7d-9d74-8603551671cf" containerID="f7976a9fbdde913f1f054c111cdb6d16c497deab6019b790a38c1b405559bf38" exitCode=0 Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.360786 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" 
event={"ID":"16331e67-7ae2-4c7d-9d74-8603551671cf","Type":"ContainerDied","Data":"f7976a9fbdde913f1f054c111cdb6d16c497deab6019b790a38c1b405559bf38"} Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.542955 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.703467 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcn2k\" (UniqueName: \"kubernetes.io/projected/16331e67-7ae2-4c7d-9d74-8603551671cf-kube-api-access-lcn2k\") pod \"16331e67-7ae2-4c7d-9d74-8603551671cf\" (UID: \"16331e67-7ae2-4c7d-9d74-8603551671cf\") " Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.704021 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16331e67-7ae2-4c7d-9d74-8603551671cf-client-ca\") pod \"16331e67-7ae2-4c7d-9d74-8603551671cf\" (UID: \"16331e67-7ae2-4c7d-9d74-8603551671cf\") " Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.704070 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16331e67-7ae2-4c7d-9d74-8603551671cf-config\") pod \"16331e67-7ae2-4c7d-9d74-8603551671cf\" (UID: \"16331e67-7ae2-4c7d-9d74-8603551671cf\") " Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.704171 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16331e67-7ae2-4c7d-9d74-8603551671cf-serving-cert\") pod \"16331e67-7ae2-4c7d-9d74-8603551671cf\" (UID: \"16331e67-7ae2-4c7d-9d74-8603551671cf\") " Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.704204 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16331e67-7ae2-4c7d-9d74-8603551671cf-proxy-ca-bundles\") pod \"16331e67-7ae2-4c7d-9d74-8603551671cf\" (UID: \"16331e67-7ae2-4c7d-9d74-8603551671cf\") " Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.705018 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16331e67-7ae2-4c7d-9d74-8603551671cf-client-ca" (OuterVolumeSpecName: "client-ca") pod "16331e67-7ae2-4c7d-9d74-8603551671cf" (UID: "16331e67-7ae2-4c7d-9d74-8603551671cf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.705131 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16331e67-7ae2-4c7d-9d74-8603551671cf-config" (OuterVolumeSpecName: "config") pod "16331e67-7ae2-4c7d-9d74-8603551671cf" (UID: "16331e67-7ae2-4c7d-9d74-8603551671cf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.705278 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16331e67-7ae2-4c7d-9d74-8603551671cf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "16331e67-7ae2-4c7d-9d74-8603551671cf" (UID: "16331e67-7ae2-4c7d-9d74-8603551671cf"). InnerVolumeSpecName "proxy-ca-bundles". 
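The "ContainerStatus from runtime service failed ... NotFound" and "DeleteContainer returned error" entries above (for 4e3c74d3... and, earlier, the catalog registry-server containers) show the kubelet tolerating containers that CRI-O has already removed. A minimal sketch of the same CRI ContainerStatus call against the runtime socket, treating NotFound as "already gone"; the socket path is the usual CRI-O default and an assumption here:

package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/credentials/insecure"
	"google.golang.org/grpc/status"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Default CRI-O socket on an OpenShift node; adjust if the runtime differs.
	conn, err := grpc.Dial("unix:///var/run/crio/crio.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatalf("dial: %v", err)
	}
	defer conn.Close()

	ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
	defer cancel()

	rt := runtimeapi.NewRuntimeServiceClient(conn)
	id := "56a9f4d3f3183605a2d0a0891af2adc8d377769b40ea5cd4f0145972aaaf632b" // one of the IDs from the log
	resp, err := rt.ContainerStatus(ctx, &runtimeapi.ContainerStatusRequest{ContainerId: id})
	if status.Code(err) == codes.NotFound {
		fmt.Println("container already removed; nothing to do") // the case scope.go logs as a RemoveContainer no-op
		return
	}
	if err != nil {
		log.Fatalf("ContainerStatus: %v", err)
	}
	fmt.Println("state:", resp.GetStatus().GetState())
}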
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.705675 4865 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16331e67-7ae2-4c7d-9d74-8603551671cf-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.705718 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16331e67-7ae2-4c7d-9d74-8603551671cf-config\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.705741 4865 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16331e67-7ae2-4c7d-9d74-8603551671cf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.720978 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16331e67-7ae2-4c7d-9d74-8603551671cf-kube-api-access-lcn2k" (OuterVolumeSpecName: "kube-api-access-lcn2k") pod "16331e67-7ae2-4c7d-9d74-8603551671cf" (UID: "16331e67-7ae2-4c7d-9d74-8603551671cf"). InnerVolumeSpecName "kube-api-access-lcn2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.722435 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16331e67-7ae2-4c7d-9d74-8603551671cf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "16331e67-7ae2-4c7d-9d74-8603551671cf" (UID: "16331e67-7ae2-4c7d-9d74-8603551671cf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.807384 4865 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16331e67-7ae2-4c7d-9d74-8603551671cf-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:36 crc kubenswrapper[4865]: I1205 05:58:36.807434 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcn2k\" (UniqueName: \"kubernetes.io/projected/16331e67-7ae2-4c7d-9d74-8603551671cf-kube-api-access-lcn2k\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354143 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-ffc6585f-srcxp"] Dec 05 05:58:37 crc kubenswrapper[4865]: E1205 05:58:37.354381 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" containerName="extract-content" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354396 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" containerName="extract-content" Dec 05 05:58:37 crc kubenswrapper[4865]: E1205 05:58:37.354413 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" containerName="extract-utilities" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354424 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" containerName="extract-utilities" Dec 05 05:58:37 crc kubenswrapper[4865]: E1205 05:58:37.354437 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dbbc361-a522-4548-a14e-bdd061c7bc4b" containerName="registry" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354445 4865 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="5dbbc361-a522-4548-a14e-bdd061c7bc4b" containerName="registry" Dec 05 05:58:37 crc kubenswrapper[4865]: E1205 05:58:37.354457 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" containerName="extract-utilities" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354466 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" containerName="extract-utilities" Dec 05 05:58:37 crc kubenswrapper[4865]: E1205 05:58:37.354477 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" containerName="extract-content" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354485 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" containerName="extract-content" Dec 05 05:58:37 crc kubenswrapper[4865]: E1205 05:58:37.354499 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" containerName="registry-server" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354507 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" containerName="registry-server" Dec 05 05:58:37 crc kubenswrapper[4865]: E1205 05:58:37.354519 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" containerName="registry-server" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354527 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" containerName="registry-server" Dec 05 05:58:37 crc kubenswrapper[4865]: E1205 05:58:37.354539 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" containerName="registry-server" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354548 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" containerName="registry-server" Dec 05 05:58:37 crc kubenswrapper[4865]: E1205 05:58:37.354562 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" containerName="extract-utilities" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354570 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" containerName="extract-utilities" Dec 05 05:58:37 crc kubenswrapper[4865]: E1205 05:58:37.354581 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16331e67-7ae2-4c7d-9d74-8603551671cf" containerName="controller-manager" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354589 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="16331e67-7ae2-4c7d-9d74-8603551671cf" containerName="controller-manager" Dec 05 05:58:37 crc kubenswrapper[4865]: E1205 05:58:37.354606 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" containerName="extract-utilities" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354614 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" containerName="extract-utilities" Dec 05 05:58:37 crc kubenswrapper[4865]: E1205 05:58:37.354624 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" containerName="extract-content" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354632 
4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" containerName="extract-content" Dec 05 05:58:37 crc kubenswrapper[4865]: E1205 05:58:37.354645 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" containerName="extract-content" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354654 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" containerName="extract-content" Dec 05 05:58:37 crc kubenswrapper[4865]: E1205 05:58:37.354671 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" containerName="registry-server" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354679 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" containerName="registry-server" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354816 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="582e42c0-b2d0-4b24-900e-1316a155c471" containerName="registry-server" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354846 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b3fdc-c602-48f5-bc10-e2e30df8cc0e" containerName="registry-server" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354863 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="748082c2-70ae-4b67-9c21-ff6f32030822" containerName="registry-server" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354874 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbe8803a-815d-4318-bfaa-1949755ed910" containerName="registry-server" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354884 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="16331e67-7ae2-4c7d-9d74-8603551671cf" containerName="controller-manager" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.354899 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dbbc361-a522-4548-a14e-bdd061c7bc4b" containerName="registry" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.355322 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.367005 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" event={"ID":"16331e67-7ae2-4c7d-9d74-8603551671cf","Type":"ContainerDied","Data":"ebd4e0a671d0d9f53a8a13fc524a96aca51ad76635487553da9356e7ad8d775f"} Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.367077 4865 scope.go:117] "RemoveContainer" containerID="f7976a9fbdde913f1f054c111cdb6d16c497deab6019b790a38c1b405559bf38" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.367173 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6bc9444ff5-5759d" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.375909 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-ffc6585f-srcxp"] Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.466096 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bc9444ff5-5759d"] Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.469293 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6bc9444ff5-5759d"] Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.523680 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvqd5\" (UniqueName: \"kubernetes.io/projected/7fd9625c-2f0c-431f-96a8-6f1510ed3211-kube-api-access-tvqd5\") pod \"controller-manager-ffc6585f-srcxp\" (UID: \"7fd9625c-2f0c-431f-96a8-6f1510ed3211\") " pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.523779 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7fd9625c-2f0c-431f-96a8-6f1510ed3211-proxy-ca-bundles\") pod \"controller-manager-ffc6585f-srcxp\" (UID: \"7fd9625c-2f0c-431f-96a8-6f1510ed3211\") " pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.523895 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fd9625c-2f0c-431f-96a8-6f1510ed3211-config\") pod \"controller-manager-ffc6585f-srcxp\" (UID: \"7fd9625c-2f0c-431f-96a8-6f1510ed3211\") " pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.523966 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7fd9625c-2f0c-431f-96a8-6f1510ed3211-client-ca\") pod \"controller-manager-ffc6585f-srcxp\" (UID: \"7fd9625c-2f0c-431f-96a8-6f1510ed3211\") " pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.524039 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fd9625c-2f0c-431f-96a8-6f1510ed3211-serving-cert\") pod \"controller-manager-ffc6585f-srcxp\" (UID: \"7fd9625c-2f0c-431f-96a8-6f1510ed3211\") " pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.625161 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7fd9625c-2f0c-431f-96a8-6f1510ed3211-proxy-ca-bundles\") pod \"controller-manager-ffc6585f-srcxp\" (UID: \"7fd9625c-2f0c-431f-96a8-6f1510ed3211\") " pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.625414 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fd9625c-2f0c-431f-96a8-6f1510ed3211-config\") pod \"controller-manager-ffc6585f-srcxp\" (UID: 
\"7fd9625c-2f0c-431f-96a8-6f1510ed3211\") " pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.625531 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7fd9625c-2f0c-431f-96a8-6f1510ed3211-client-ca\") pod \"controller-manager-ffc6585f-srcxp\" (UID: \"7fd9625c-2f0c-431f-96a8-6f1510ed3211\") " pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.625674 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fd9625c-2f0c-431f-96a8-6f1510ed3211-serving-cert\") pod \"controller-manager-ffc6585f-srcxp\" (UID: \"7fd9625c-2f0c-431f-96a8-6f1510ed3211\") " pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.625804 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvqd5\" (UniqueName: \"kubernetes.io/projected/7fd9625c-2f0c-431f-96a8-6f1510ed3211-kube-api-access-tvqd5\") pod \"controller-manager-ffc6585f-srcxp\" (UID: \"7fd9625c-2f0c-431f-96a8-6f1510ed3211\") " pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.626492 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7fd9625c-2f0c-431f-96a8-6f1510ed3211-client-ca\") pod \"controller-manager-ffc6585f-srcxp\" (UID: \"7fd9625c-2f0c-431f-96a8-6f1510ed3211\") " pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.626615 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7fd9625c-2f0c-431f-96a8-6f1510ed3211-proxy-ca-bundles\") pod \"controller-manager-ffc6585f-srcxp\" (UID: \"7fd9625c-2f0c-431f-96a8-6f1510ed3211\") " pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.626878 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fd9625c-2f0c-431f-96a8-6f1510ed3211-config\") pod \"controller-manager-ffc6585f-srcxp\" (UID: \"7fd9625c-2f0c-431f-96a8-6f1510ed3211\") " pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.629647 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fd9625c-2f0c-431f-96a8-6f1510ed3211-serving-cert\") pod \"controller-manager-ffc6585f-srcxp\" (UID: \"7fd9625c-2f0c-431f-96a8-6f1510ed3211\") " pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.651632 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvqd5\" (UniqueName: \"kubernetes.io/projected/7fd9625c-2f0c-431f-96a8-6f1510ed3211-kube-api-access-tvqd5\") pod \"controller-manager-ffc6585f-srcxp\" (UID: \"7fd9625c-2f0c-431f-96a8-6f1510ed3211\") " pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.673187 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" Dec 05 05:58:37 crc kubenswrapper[4865]: I1205 05:58:37.973959 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-ffc6585f-srcxp"] Dec 05 05:58:37 crc kubenswrapper[4865]: W1205 05:58:37.987934 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fd9625c_2f0c_431f_96a8_6f1510ed3211.slice/crio-af7619651f32f8dbc2e6066ada072fc47f228ac64d26708db23d00816443ca1b WatchSource:0}: Error finding container af7619651f32f8dbc2e6066ada072fc47f228ac64d26708db23d00816443ca1b: Status 404 returned error can't find the container with id af7619651f32f8dbc2e6066ada072fc47f228ac64d26708db23d00816443ca1b Dec 05 05:58:38 crc kubenswrapper[4865]: I1205 05:58:38.373130 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" event={"ID":"7fd9625c-2f0c-431f-96a8-6f1510ed3211","Type":"ContainerStarted","Data":"4eb4763e448670393c891643afaaccfecbb809ecbc1a88080176badea6819ef8"} Dec 05 05:58:38 crc kubenswrapper[4865]: I1205 05:58:38.373172 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" event={"ID":"7fd9625c-2f0c-431f-96a8-6f1510ed3211","Type":"ContainerStarted","Data":"af7619651f32f8dbc2e6066ada072fc47f228ac64d26708db23d00816443ca1b"} Dec 05 05:58:38 crc kubenswrapper[4865]: I1205 05:58:38.376055 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" Dec 05 05:58:38 crc kubenswrapper[4865]: I1205 05:58:38.382474 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" Dec 05 05:58:38 crc kubenswrapper[4865]: I1205 05:58:38.449986 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-ffc6585f-srcxp" podStartSLOduration=2.449961756 podStartE2EDuration="2.449961756s" podCreationTimestamp="2025-12-05 05:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:58:38.443200065 +0000 UTC m=+337.723211297" watchObservedRunningTime="2025-12-05 05:58:38.449961756 +0000 UTC m=+337.729972978" Dec 05 05:58:39 crc kubenswrapper[4865]: I1205 05:58:39.013020 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16331e67-7ae2-4c7d-9d74-8603551671cf" path="/var/lib/kubelet/pods/16331e67-7ae2-4c7d-9d74-8603551671cf/volumes" Dec 05 05:58:41 crc kubenswrapper[4865]: I1205 05:58:41.048834 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 05:58:41 crc kubenswrapper[4865]: I1205 05:58:41.049275 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 05:58:41 crc kubenswrapper[4865]: I1205 
05:58:41.932355 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fsjkk"] Dec 05 05:58:41 crc kubenswrapper[4865]: I1205 05:58:41.932596 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fsjkk" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" containerName="registry-server" containerID="cri-o://7fa1a585ae25b8f9f781f30b371806c330383d617a00d95ae34dac617dac0583" gracePeriod=30 Dec 05 05:58:41 crc kubenswrapper[4865]: I1205 05:58:41.946625 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bdk79"] Dec 05 05:58:41 crc kubenswrapper[4865]: I1205 05:58:41.946910 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bdk79" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" containerName="registry-server" containerID="cri-o://977be8a03e1dbfc0f779f1f6e113f33e7655f330bb6e84aaca5e05932fea6df6" gracePeriod=30 Dec 05 05:58:41 crc kubenswrapper[4865]: I1205 05:58:41.958752 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8thws"] Dec 05 05:58:41 crc kubenswrapper[4865]: I1205 05:58:41.958961 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" podUID="1f49a368-065d-4057-a044-a019eba9ce9e" containerName="marketplace-operator" containerID="cri-o://68ab0d28476f44451dadc0f67a3d1bc6c68028ad03f8baca0545f8a47f6903e2" gracePeriod=30 Dec 05 05:58:41 crc kubenswrapper[4865]: I1205 05:58:41.969297 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jjg7p"] Dec 05 05:58:41 crc kubenswrapper[4865]: I1205 05:58:41.969542 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jjg7p" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" containerName="registry-server" containerID="cri-o://94759b43bbee15d0b8ce41e22074feb3cd9c47791e46bb06a34ebf4495aa179a" gracePeriod=30 Dec 05 05:58:41 crc kubenswrapper[4865]: I1205 05:58:41.980334 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xljcc"] Dec 05 05:58:41 crc kubenswrapper[4865]: I1205 05:58:41.980575 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xljcc" podUID="04a4a0fc-43e5-4409-a8e5-bfa4b2525322" containerName="registry-server" containerID="cri-o://70d23a6d79d5b5a207c36677970567804dd0168a20b9e0f576f951f830d7eaae" gracePeriod=30 Dec 05 05:58:41 crc kubenswrapper[4865]: I1205 05:58:41.998591 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bnwqc"] Dec 05 05:58:41 crc kubenswrapper[4865]: I1205 05:58:41.999283 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bnwqc" Dec 05 05:58:42 crc kubenswrapper[4865]: I1205 05:58:42.013417 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bnwqc"] Dec 05 05:58:42 crc kubenswrapper[4865]: I1205 05:58:42.096419 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fhwh\" (UniqueName: \"kubernetes.io/projected/688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d-kube-api-access-4fhwh\") pod \"marketplace-operator-79b997595-bnwqc\" (UID: \"688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-bnwqc" Dec 05 05:58:42 crc kubenswrapper[4865]: I1205 05:58:42.096466 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bnwqc\" (UID: \"688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-bnwqc" Dec 05 05:58:42 crc kubenswrapper[4865]: I1205 05:58:42.096586 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bnwqc\" (UID: \"688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-bnwqc" Dec 05 05:58:42 crc kubenswrapper[4865]: I1205 05:58:42.197932 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fhwh\" (UniqueName: \"kubernetes.io/projected/688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d-kube-api-access-4fhwh\") pod \"marketplace-operator-79b997595-bnwqc\" (UID: \"688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-bnwqc" Dec 05 05:58:42 crc kubenswrapper[4865]: I1205 05:58:42.197978 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bnwqc\" (UID: \"688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-bnwqc" Dec 05 05:58:42 crc kubenswrapper[4865]: I1205 05:58:42.198029 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bnwqc\" (UID: \"688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-bnwqc" Dec 05 05:58:42 crc kubenswrapper[4865]: I1205 05:58:42.199310 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bnwqc\" (UID: \"688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-bnwqc" Dec 05 05:58:42 crc kubenswrapper[4865]: I1205 05:58:42.209826 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bnwqc\" (UID: \"688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-bnwqc" Dec 05 05:58:42 crc kubenswrapper[4865]: I1205 05:58:42.217099 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fhwh\" (UniqueName: \"kubernetes.io/projected/688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d-kube-api-access-4fhwh\") pod \"marketplace-operator-79b997595-bnwqc\" (UID: \"688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-bnwqc" Dec 05 05:58:42 crc kubenswrapper[4865]: I1205 05:58:42.320154 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bnwqc" Dec 05 05:58:42 crc kubenswrapper[4865]: E1205 05:58:42.522487 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fa1a585ae25b8f9f781f30b371806c330383d617a00d95ae34dac617dac0583 is running failed: container process not found" containerID="7fa1a585ae25b8f9f781f30b371806c330383d617a00d95ae34dac617dac0583" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 05:58:42 crc kubenswrapper[4865]: E1205 05:58:42.524914 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fa1a585ae25b8f9f781f30b371806c330383d617a00d95ae34dac617dac0583 is running failed: container process not found" containerID="7fa1a585ae25b8f9f781f30b371806c330383d617a00d95ae34dac617dac0583" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 05:58:42 crc kubenswrapper[4865]: E1205 05:58:42.526130 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fa1a585ae25b8f9f781f30b371806c330383d617a00d95ae34dac617dac0583 is running failed: container process not found" containerID="7fa1a585ae25b8f9f781f30b371806c330383d617a00d95ae34dac617dac0583" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 05:58:42 crc kubenswrapper[4865]: E1205 05:58:42.526168 4865 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fa1a585ae25b8f9f781f30b371806c330383d617a00d95ae34dac617dac0583 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-fsjkk" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" containerName="registry-server" Dec 05 05:58:42 crc kubenswrapper[4865]: I1205 05:58:42.759548 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bnwqc"] Dec 05 05:58:42 crc kubenswrapper[4865]: W1205 05:58:42.764333 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod688f2ae9_08ac_4eb2_9d51_d6c4f9c3be5d.slice/crio-2d9736eb13b84dc3016be0a38bd8a3b8570a638a48f216d4758edd0e5969fd9f WatchSource:0}: Error finding container 2d9736eb13b84dc3016be0a38bd8a3b8570a638a48f216d4758edd0e5969fd9f: Status 404 returned error can't find the container with id 2d9736eb13b84dc3016be0a38bd8a3b8570a638a48f216d4758edd0e5969fd9f Dec 05 05:58:43 crc kubenswrapper[4865]: I1205 05:58:43.403145 4865 generic.go:334] "Generic (PLEG): container finished" 
podID="8ed66f14-1ac8-456b-b3bb-d909c0164767" containerID="977be8a03e1dbfc0f779f1f6e113f33e7655f330bb6e84aaca5e05932fea6df6" exitCode=0 Dec 05 05:58:43 crc kubenswrapper[4865]: I1205 05:58:43.403224 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdk79" event={"ID":"8ed66f14-1ac8-456b-b3bb-d909c0164767","Type":"ContainerDied","Data":"977be8a03e1dbfc0f779f1f6e113f33e7655f330bb6e84aaca5e05932fea6df6"} Dec 05 05:58:43 crc kubenswrapper[4865]: I1205 05:58:43.404340 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bnwqc" event={"ID":"688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d","Type":"ContainerStarted","Data":"2d9736eb13b84dc3016be0a38bd8a3b8570a638a48f216d4758edd0e5969fd9f"} Dec 05 05:58:43 crc kubenswrapper[4865]: E1205 05:58:43.593693 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 94759b43bbee15d0b8ce41e22074feb3cd9c47791e46bb06a34ebf4495aa179a is running failed: container process not found" containerID="94759b43bbee15d0b8ce41e22074feb3cd9c47791e46bb06a34ebf4495aa179a" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 05:58:43 crc kubenswrapper[4865]: E1205 05:58:43.594152 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 94759b43bbee15d0b8ce41e22074feb3cd9c47791e46bb06a34ebf4495aa179a is running failed: container process not found" containerID="94759b43bbee15d0b8ce41e22074feb3cd9c47791e46bb06a34ebf4495aa179a" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 05:58:43 crc kubenswrapper[4865]: E1205 05:58:43.594361 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 94759b43bbee15d0b8ce41e22074feb3cd9c47791e46bb06a34ebf4495aa179a is running failed: container process not found" containerID="94759b43bbee15d0b8ce41e22074feb3cd9c47791e46bb06a34ebf4495aa179a" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 05:58:43 crc kubenswrapper[4865]: E1205 05:58:43.594398 4865 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 94759b43bbee15d0b8ce41e22074feb3cd9c47791e46bb06a34ebf4495aa179a is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-jjg7p" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" containerName="registry-server" Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.421415 4865 generic.go:334] "Generic (PLEG): container finished" podID="fc0b366c-dba6-4a98-8335-e5434858e367" containerID="7fa1a585ae25b8f9f781f30b371806c330383d617a00d95ae34dac617dac0583" exitCode=0 Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.421728 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsjkk" event={"ID":"fc0b366c-dba6-4a98-8335-e5434858e367","Type":"ContainerDied","Data":"7fa1a585ae25b8f9f781f30b371806c330383d617a00d95ae34dac617dac0583"} Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.425078 4865 generic.go:334] "Generic (PLEG): container finished" podID="f43580fe-7567-4fb7-b1fc-203bda11942a" containerID="94759b43bbee15d0b8ce41e22074feb3cd9c47791e46bb06a34ebf4495aa179a" exitCode=0 Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.425123 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-jjg7p" event={"ID":"f43580fe-7567-4fb7-b1fc-203bda11942a","Type":"ContainerDied","Data":"94759b43bbee15d0b8ce41e22074feb3cd9c47791e46bb06a34ebf4495aa179a"} Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.427052 4865 generic.go:334] "Generic (PLEG): container finished" podID="1f49a368-065d-4057-a044-a019eba9ce9e" containerID="68ab0d28476f44451dadc0f67a3d1bc6c68028ad03f8baca0545f8a47f6903e2" exitCode=0 Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.427100 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" event={"ID":"1f49a368-065d-4057-a044-a019eba9ce9e","Type":"ContainerDied","Data":"68ab0d28476f44451dadc0f67a3d1bc6c68028ad03f8baca0545f8a47f6903e2"} Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.427126 4865 scope.go:117] "RemoveContainer" containerID="420b599485c42d5136bba3e581869141e0c3970fbbaf3e429e65daa33707125d" Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.441863 4865 generic.go:334] "Generic (PLEG): container finished" podID="04a4a0fc-43e5-4409-a8e5-bfa4b2525322" containerID="70d23a6d79d5b5a207c36677970567804dd0168a20b9e0f576f951f830d7eaae" exitCode=0 Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.441980 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xljcc" event={"ID":"04a4a0fc-43e5-4409-a8e5-bfa4b2525322","Type":"ContainerDied","Data":"70d23a6d79d5b5a207c36677970567804dd0168a20b9e0f576f951f830d7eaae"} Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.445136 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bnwqc" event={"ID":"688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d","Type":"ContainerStarted","Data":"1adffb3aae30a2d7cf0ac63f85d5db8470b5dbb4d9468f7552bf7828e934e4b7"} Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.445877 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bnwqc" Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.455121 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bnwqc" Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.470287 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bnwqc" podStartSLOduration=3.470266293 podStartE2EDuration="3.470266293s" podCreationTimestamp="2025-12-05 05:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 05:58:44.467659669 +0000 UTC m=+343.747670911" watchObservedRunningTime="2025-12-05 05:58:44.470266293 +0000 UTC m=+343.750277515" Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.566165 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fsjkk" Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.726718 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t8j9\" (UniqueName: \"kubernetes.io/projected/fc0b366c-dba6-4a98-8335-e5434858e367-kube-api-access-9t8j9\") pod \"fc0b366c-dba6-4a98-8335-e5434858e367\" (UID: \"fc0b366c-dba6-4a98-8335-e5434858e367\") " Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.726884 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc0b366c-dba6-4a98-8335-e5434858e367-utilities\") pod \"fc0b366c-dba6-4a98-8335-e5434858e367\" (UID: \"fc0b366c-dba6-4a98-8335-e5434858e367\") " Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.726942 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc0b366c-dba6-4a98-8335-e5434858e367-catalog-content\") pod \"fc0b366c-dba6-4a98-8335-e5434858e367\" (UID: \"fc0b366c-dba6-4a98-8335-e5434858e367\") " Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.728057 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc0b366c-dba6-4a98-8335-e5434858e367-utilities" (OuterVolumeSpecName: "utilities") pod "fc0b366c-dba6-4a98-8335-e5434858e367" (UID: "fc0b366c-dba6-4a98-8335-e5434858e367"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.768399 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc0b366c-dba6-4a98-8335-e5434858e367-kube-api-access-9t8j9" (OuterVolumeSpecName: "kube-api-access-9t8j9") pod "fc0b366c-dba6-4a98-8335-e5434858e367" (UID: "fc0b366c-dba6-4a98-8335-e5434858e367"). InnerVolumeSpecName "kube-api-access-9t8j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.801458 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc0b366c-dba6-4a98-8335-e5434858e367-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc0b366c-dba6-4a98-8335-e5434858e367" (UID: "fc0b366c-dba6-4a98-8335-e5434858e367"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.812016 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xljcc" Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.828050 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t8j9\" (UniqueName: \"kubernetes.io/projected/fc0b366c-dba6-4a98-8335-e5434858e367-kube-api-access-9t8j9\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.828081 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc0b366c-dba6-4a98-8335-e5434858e367-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.828093 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc0b366c-dba6-4a98-8335-e5434858e367-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.930493 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a4a0fc-43e5-4409-a8e5-bfa4b2525322-utilities\") pod \"04a4a0fc-43e5-4409-a8e5-bfa4b2525322\" (UID: \"04a4a0fc-43e5-4409-a8e5-bfa4b2525322\") " Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.930649 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prrgt\" (UniqueName: \"kubernetes.io/projected/04a4a0fc-43e5-4409-a8e5-bfa4b2525322-kube-api-access-prrgt\") pod \"04a4a0fc-43e5-4409-a8e5-bfa4b2525322\" (UID: \"04a4a0fc-43e5-4409-a8e5-bfa4b2525322\") " Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.930688 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a4a0fc-43e5-4409-a8e5-bfa4b2525322-catalog-content\") pod \"04a4a0fc-43e5-4409-a8e5-bfa4b2525322\" (UID: \"04a4a0fc-43e5-4409-a8e5-bfa4b2525322\") " Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.934351 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04a4a0fc-43e5-4409-a8e5-bfa4b2525322-utilities" (OuterVolumeSpecName: "utilities") pod "04a4a0fc-43e5-4409-a8e5-bfa4b2525322" (UID: "04a4a0fc-43e5-4409-a8e5-bfa4b2525322"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:58:44 crc kubenswrapper[4865]: I1205 05:58:44.936494 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04a4a0fc-43e5-4409-a8e5-bfa4b2525322-kube-api-access-prrgt" (OuterVolumeSpecName: "kube-api-access-prrgt") pod "04a4a0fc-43e5-4409-a8e5-bfa4b2525322" (UID: "04a4a0fc-43e5-4409-a8e5-bfa4b2525322"). InnerVolumeSpecName "kube-api-access-prrgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.031679 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a4a0fc-43e5-4409-a8e5-bfa4b2525322-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.031709 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prrgt\" (UniqueName: \"kubernetes.io/projected/04a4a0fc-43e5-4409-a8e5-bfa4b2525322-kube-api-access-prrgt\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.038665 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jjg7p" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.051974 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.080084 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bdk79" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.080695 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04a4a0fc-43e5-4409-a8e5-bfa4b2525322-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04a4a0fc-43e5-4409-a8e5-bfa4b2525322" (UID: "04a4a0fc-43e5-4409-a8e5-bfa4b2525322"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.132581 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f43580fe-7567-4fb7-b1fc-203bda11942a-utilities\") pod \"f43580fe-7567-4fb7-b1fc-203bda11942a\" (UID: \"f43580fe-7567-4fb7-b1fc-203bda11942a\") " Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.132721 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbw8f\" (UniqueName: \"kubernetes.io/projected/f43580fe-7567-4fb7-b1fc-203bda11942a-kube-api-access-sbw8f\") pod \"f43580fe-7567-4fb7-b1fc-203bda11942a\" (UID: \"f43580fe-7567-4fb7-b1fc-203bda11942a\") " Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.132749 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f43580fe-7567-4fb7-b1fc-203bda11942a-catalog-content\") pod \"f43580fe-7567-4fb7-b1fc-203bda11942a\" (UID: \"f43580fe-7567-4fb7-b1fc-203bda11942a\") " Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.133013 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a4a0fc-43e5-4409-a8e5-bfa4b2525322-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.137087 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43580fe-7567-4fb7-b1fc-203bda11942a-utilities" (OuterVolumeSpecName: "utilities") pod "f43580fe-7567-4fb7-b1fc-203bda11942a" (UID: "f43580fe-7567-4fb7-b1fc-203bda11942a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.139653 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f43580fe-7567-4fb7-b1fc-203bda11942a-kube-api-access-sbw8f" (OuterVolumeSpecName: "kube-api-access-sbw8f") pod "f43580fe-7567-4fb7-b1fc-203bda11942a" (UID: "f43580fe-7567-4fb7-b1fc-203bda11942a"). InnerVolumeSpecName "kube-api-access-sbw8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.152565 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43580fe-7567-4fb7-b1fc-203bda11942a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f43580fe-7567-4fb7-b1fc-203bda11942a" (UID: "f43580fe-7567-4fb7-b1fc-203bda11942a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.233804 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1f49a368-065d-4057-a044-a019eba9ce9e-marketplace-operator-metrics\") pod \"1f49a368-065d-4057-a044-a019eba9ce9e\" (UID: \"1f49a368-065d-4057-a044-a019eba9ce9e\") " Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.234199 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ed66f14-1ac8-456b-b3bb-d909c0164767-catalog-content\") pod \"8ed66f14-1ac8-456b-b3bb-d909c0164767\" (UID: \"8ed66f14-1ac8-456b-b3bb-d909c0164767\") " Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.234231 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f49a368-065d-4057-a044-a019eba9ce9e-marketplace-trusted-ca\") pod \"1f49a368-065d-4057-a044-a019eba9ce9e\" (UID: \"1f49a368-065d-4057-a044-a019eba9ce9e\") " Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.234273 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjbll\" (UniqueName: \"kubernetes.io/projected/8ed66f14-1ac8-456b-b3bb-d909c0164767-kube-api-access-bjbll\") pod \"8ed66f14-1ac8-456b-b3bb-d909c0164767\" (UID: \"8ed66f14-1ac8-456b-b3bb-d909c0164767\") " Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.234312 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph2jj\" (UniqueName: \"kubernetes.io/projected/1f49a368-065d-4057-a044-a019eba9ce9e-kube-api-access-ph2jj\") pod \"1f49a368-065d-4057-a044-a019eba9ce9e\" (UID: \"1f49a368-065d-4057-a044-a019eba9ce9e\") " Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.234341 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ed66f14-1ac8-456b-b3bb-d909c0164767-utilities\") pod \"8ed66f14-1ac8-456b-b3bb-d909c0164767\" (UID: \"8ed66f14-1ac8-456b-b3bb-d909c0164767\") " Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.234586 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbw8f\" (UniqueName: \"kubernetes.io/projected/f43580fe-7567-4fb7-b1fc-203bda11942a-kube-api-access-sbw8f\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.234599 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f43580fe-7567-4fb7-b1fc-203bda11942a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.234608 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f43580fe-7567-4fb7-b1fc-203bda11942a-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.235173 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f49a368-065d-4057-a044-a019eba9ce9e-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "1f49a368-065d-4057-a044-a019eba9ce9e" (UID: "1f49a368-065d-4057-a044-a019eba9ce9e"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.235312 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ed66f14-1ac8-456b-b3bb-d909c0164767-utilities" (OuterVolumeSpecName: "utilities") pod "8ed66f14-1ac8-456b-b3bb-d909c0164767" (UID: "8ed66f14-1ac8-456b-b3bb-d909c0164767"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.237907 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ed66f14-1ac8-456b-b3bb-d909c0164767-kube-api-access-bjbll" (OuterVolumeSpecName: "kube-api-access-bjbll") pod "8ed66f14-1ac8-456b-b3bb-d909c0164767" (UID: "8ed66f14-1ac8-456b-b3bb-d909c0164767"). InnerVolumeSpecName "kube-api-access-bjbll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.238746 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f49a368-065d-4057-a044-a019eba9ce9e-kube-api-access-ph2jj" (OuterVolumeSpecName: "kube-api-access-ph2jj") pod "1f49a368-065d-4057-a044-a019eba9ce9e" (UID: "1f49a368-065d-4057-a044-a019eba9ce9e"). InnerVolumeSpecName "kube-api-access-ph2jj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.241233 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f49a368-065d-4057-a044-a019eba9ce9e-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "1f49a368-065d-4057-a044-a019eba9ce9e" (UID: "1f49a368-065d-4057-a044-a019eba9ce9e"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.292960 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ed66f14-1ac8-456b-b3bb-d909c0164767-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ed66f14-1ac8-456b-b3bb-d909c0164767" (UID: "8ed66f14-1ac8-456b-b3bb-d909c0164767"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.336088 4865 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1f49a368-065d-4057-a044-a019eba9ce9e-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.336124 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ed66f14-1ac8-456b-b3bb-d909c0164767-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.336132 4865 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f49a368-065d-4057-a044-a019eba9ce9e-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.336142 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjbll\" (UniqueName: \"kubernetes.io/projected/8ed66f14-1ac8-456b-b3bb-d909c0164767-kube-api-access-bjbll\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.336150 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph2jj\" (UniqueName: \"kubernetes.io/projected/1f49a368-065d-4057-a044-a019eba9ce9e-kube-api-access-ph2jj\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.336161 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ed66f14-1ac8-456b-b3bb-d909c0164767-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.456159 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jjg7p" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.456170 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjg7p" event={"ID":"f43580fe-7567-4fb7-b1fc-203bda11942a","Type":"ContainerDied","Data":"48b466c65c1407c75f36ab9823eb5e3dbd882a4917f42a3a9e7f16bebed84e43"} Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.457376 4865 scope.go:117] "RemoveContainer" containerID="94759b43bbee15d0b8ce41e22074feb3cd9c47791e46bb06a34ebf4495aa179a" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.459756 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdk79" event={"ID":"8ed66f14-1ac8-456b-b3bb-d909c0164767","Type":"ContainerDied","Data":"6f0bd87a1ca4479772d2d0bcfccbd9bfccf93e0ccca1c150d7681d30c8004184"} Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.460384 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bdk79" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.461426 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.461541 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8thws" event={"ID":"1f49a368-065d-4057-a044-a019eba9ce9e","Type":"ContainerDied","Data":"1f3ab2feb01855bef43c2cbb2b494a7ec28e6428d9d18f10664106b67e7eb194"} Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.465104 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xljcc" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.465091 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xljcc" event={"ID":"04a4a0fc-43e5-4409-a8e5-bfa4b2525322","Type":"ContainerDied","Data":"e3a74d3a20f449c76a1ee7645a9553a8c73e26f3b3fe573259dbafa95ccb8c43"} Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.472054 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsjkk" event={"ID":"fc0b366c-dba6-4a98-8335-e5434858e367","Type":"ContainerDied","Data":"866030eaba4643461f3e17884d5436967d7d0e7fd465026e53fc7f73e1224b78"} Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.472357 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fsjkk" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.493969 4865 scope.go:117] "RemoveContainer" containerID="e72dfa1bf633d205c717c4bf710c9ee04dabe452a7ecdca7037a2e683c708f97" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.506441 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fsjkk"] Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.516638 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fsjkk"] Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.531958 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8thws"] Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.537859 4865 scope.go:117] "RemoveContainer" containerID="cf543d72370a72d55aa4e8c98442513b27a05de901e72cb03c4d3aebd5698f68" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.543266 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8thws"] Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.548531 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bdk79"] Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.556053 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bdk79"] Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.565201 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xljcc"] Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.569382 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xljcc"] Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.571489 4865 scope.go:117] "RemoveContainer" containerID="977be8a03e1dbfc0f779f1f6e113f33e7655f330bb6e84aaca5e05932fea6df6" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.573993 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-jjg7p"] Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.576539 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jjg7p"] Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.583635 4865 scope.go:117] "RemoveContainer" containerID="c963e1428a698cf16325285b9b17d4c3d8ab4797a22e9e9c9dc2fdd02caa6c77" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.602106 4865 scope.go:117] "RemoveContainer" containerID="66d18640f9d306b894c65efc4cae963b490435ddfb4178bc3cea859ed1eb904e" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.618531 4865 scope.go:117] "RemoveContainer" containerID="68ab0d28476f44451dadc0f67a3d1bc6c68028ad03f8baca0545f8a47f6903e2" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.637242 4865 scope.go:117] "RemoveContainer" containerID="70d23a6d79d5b5a207c36677970567804dd0168a20b9e0f576f951f830d7eaae" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.653388 4865 scope.go:117] "RemoveContainer" containerID="f5cf0d4b49e34fc58e891af15f631790d507b6ac2f15c09a3c2df944ee623757" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.669437 4865 scope.go:117] "RemoveContainer" containerID="5775199bc83ffa7b87ff62243a5ad5dc2f5bec21d50ac73175d93f92060dc9cd" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.681542 4865 scope.go:117] "RemoveContainer" containerID="7fa1a585ae25b8f9f781f30b371806c330383d617a00d95ae34dac617dac0583" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.693603 4865 scope.go:117] "RemoveContainer" containerID="6d16f849561442a73dc55c4f9b41f4cd0c34477750439c735b90057a99219344" Dec 05 05:58:45 crc kubenswrapper[4865]: I1205 05:58:45.708149 4865 scope.go:117] "RemoveContainer" containerID="e8db9b028a2622ff385660a93281e21c61c7ac09f0e4e31755a5b4208700541c" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.011966 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04a4a0fc-43e5-4409-a8e5-bfa4b2525322" path="/var/lib/kubelet/pods/04a4a0fc-43e5-4409-a8e5-bfa4b2525322/volumes" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.012890 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f49a368-065d-4057-a044-a019eba9ce9e" path="/var/lib/kubelet/pods/1f49a368-065d-4057-a044-a019eba9ce9e/volumes" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.013333 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" path="/var/lib/kubelet/pods/8ed66f14-1ac8-456b-b3bb-d909c0164767/volumes" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.013918 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" path="/var/lib/kubelet/pods/f43580fe-7567-4fb7-b1fc-203bda11942a/volumes" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.014454 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" path="/var/lib/kubelet/pods/fc0b366c-dba6-4a98-8335-e5434858e367/volumes" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169252 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f4cmh"] Dec 05 05:58:47 crc kubenswrapper[4865]: E1205 05:58:47.169443 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" containerName="registry-server" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169455 4865 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" containerName="registry-server" Dec 05 05:58:47 crc kubenswrapper[4865]: E1205 05:58:47.169467 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" containerName="registry-server" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169473 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" containerName="registry-server" Dec 05 05:58:47 crc kubenswrapper[4865]: E1205 05:58:47.169481 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" containerName="extract-content" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169488 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" containerName="extract-content" Dec 05 05:58:47 crc kubenswrapper[4865]: E1205 05:58:47.169498 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a4a0fc-43e5-4409-a8e5-bfa4b2525322" containerName="registry-server" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169504 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a4a0fc-43e5-4409-a8e5-bfa4b2525322" containerName="registry-server" Dec 05 05:58:47 crc kubenswrapper[4865]: E1205 05:58:47.169512 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" containerName="extract-content" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169518 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" containerName="extract-content" Dec 05 05:58:47 crc kubenswrapper[4865]: E1205 05:58:47.169530 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" containerName="registry-server" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169535 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" containerName="registry-server" Dec 05 05:58:47 crc kubenswrapper[4865]: E1205 05:58:47.169546 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a4a0fc-43e5-4409-a8e5-bfa4b2525322" containerName="extract-utilities" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169553 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a4a0fc-43e5-4409-a8e5-bfa4b2525322" containerName="extract-utilities" Dec 05 05:58:47 crc kubenswrapper[4865]: E1205 05:58:47.169560 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a4a0fc-43e5-4409-a8e5-bfa4b2525322" containerName="extract-content" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169566 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a4a0fc-43e5-4409-a8e5-bfa4b2525322" containerName="extract-content" Dec 05 05:58:47 crc kubenswrapper[4865]: E1205 05:58:47.169576 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" containerName="extract-utilities" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169581 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" containerName="extract-utilities" Dec 05 05:58:47 crc kubenswrapper[4865]: E1205 05:58:47.169589 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" containerName="extract-utilities" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 
05:58:47.169595 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" containerName="extract-utilities" Dec 05 05:58:47 crc kubenswrapper[4865]: E1205 05:58:47.169603 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" containerName="extract-utilities" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169609 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" containerName="extract-utilities" Dec 05 05:58:47 crc kubenswrapper[4865]: E1205 05:58:47.169616 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f49a368-065d-4057-a044-a019eba9ce9e" containerName="marketplace-operator" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169621 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f49a368-065d-4057-a044-a019eba9ce9e" containerName="marketplace-operator" Dec 05 05:58:47 crc kubenswrapper[4865]: E1205 05:58:47.169628 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" containerName="extract-content" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169634 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" containerName="extract-content" Dec 05 05:58:47 crc kubenswrapper[4865]: E1205 05:58:47.169641 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f49a368-065d-4057-a044-a019eba9ce9e" containerName="marketplace-operator" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169647 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f49a368-065d-4057-a044-a019eba9ce9e" containerName="marketplace-operator" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169736 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc0b366c-dba6-4a98-8335-e5434858e367" containerName="registry-server" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169748 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f49a368-065d-4057-a044-a019eba9ce9e" containerName="marketplace-operator" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169759 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f49a368-065d-4057-a044-a019eba9ce9e" containerName="marketplace-operator" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169767 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="04a4a0fc-43e5-4409-a8e5-bfa4b2525322" containerName="registry-server" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169776 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ed66f14-1ac8-456b-b3bb-d909c0164767" containerName="registry-server" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.169786 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43580fe-7567-4fb7-b1fc-203bda11942a" containerName="registry-server" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.170525 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f4cmh" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.173062 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f4cmh"] Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.174547 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.362303 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-46kd4"] Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.363637 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-46kd4" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.366935 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.367880 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa63a1a-0917-4c28-9307-4580800618e2-catalog-content\") pod \"redhat-operators-f4cmh\" (UID: \"4aa63a1a-0917-4c28-9307-4580800618e2\") " pod="openshift-marketplace/redhat-operators-f4cmh" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.367914 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4sq6\" (UniqueName: \"kubernetes.io/projected/4aa63a1a-0917-4c28-9307-4580800618e2-kube-api-access-p4sq6\") pod \"redhat-operators-f4cmh\" (UID: \"4aa63a1a-0917-4c28-9307-4580800618e2\") " pod="openshift-marketplace/redhat-operators-f4cmh" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.367960 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa63a1a-0917-4c28-9307-4580800618e2-utilities\") pod \"redhat-operators-f4cmh\" (UID: \"4aa63a1a-0917-4c28-9307-4580800618e2\") " pod="openshift-marketplace/redhat-operators-f4cmh" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.371120 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-46kd4"] Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.469472 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa63a1a-0917-4c28-9307-4580800618e2-catalog-content\") pod \"redhat-operators-f4cmh\" (UID: \"4aa63a1a-0917-4c28-9307-4580800618e2\") " pod="openshift-marketplace/redhat-operators-f4cmh" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.469550 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4sq6\" (UniqueName: \"kubernetes.io/projected/4aa63a1a-0917-4c28-9307-4580800618e2-kube-api-access-p4sq6\") pod \"redhat-operators-f4cmh\" (UID: \"4aa63a1a-0917-4c28-9307-4580800618e2\") " pod="openshift-marketplace/redhat-operators-f4cmh" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.469623 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sks6n\" (UniqueName: \"kubernetes.io/projected/f1f44df5-e37a-4e11-995d-91a6d2fc538d-kube-api-access-sks6n\") pod \"redhat-marketplace-46kd4\" (UID: \"f1f44df5-e37a-4e11-995d-91a6d2fc538d\") 
" pod="openshift-marketplace/redhat-marketplace-46kd4" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.469651 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f44df5-e37a-4e11-995d-91a6d2fc538d-utilities\") pod \"redhat-marketplace-46kd4\" (UID: \"f1f44df5-e37a-4e11-995d-91a6d2fc538d\") " pod="openshift-marketplace/redhat-marketplace-46kd4" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.470043 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa63a1a-0917-4c28-9307-4580800618e2-catalog-content\") pod \"redhat-operators-f4cmh\" (UID: \"4aa63a1a-0917-4c28-9307-4580800618e2\") " pod="openshift-marketplace/redhat-operators-f4cmh" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.470331 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f44df5-e37a-4e11-995d-91a6d2fc538d-catalog-content\") pod \"redhat-marketplace-46kd4\" (UID: \"f1f44df5-e37a-4e11-995d-91a6d2fc538d\") " pod="openshift-marketplace/redhat-marketplace-46kd4" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.470394 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa63a1a-0917-4c28-9307-4580800618e2-utilities\") pod \"redhat-operators-f4cmh\" (UID: \"4aa63a1a-0917-4c28-9307-4580800618e2\") " pod="openshift-marketplace/redhat-operators-f4cmh" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.470695 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa63a1a-0917-4c28-9307-4580800618e2-utilities\") pod \"redhat-operators-f4cmh\" (UID: \"4aa63a1a-0917-4c28-9307-4580800618e2\") " pod="openshift-marketplace/redhat-operators-f4cmh" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.488600 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4sq6\" (UniqueName: \"kubernetes.io/projected/4aa63a1a-0917-4c28-9307-4580800618e2-kube-api-access-p4sq6\") pod \"redhat-operators-f4cmh\" (UID: \"4aa63a1a-0917-4c28-9307-4580800618e2\") " pod="openshift-marketplace/redhat-operators-f4cmh" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.571540 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sks6n\" (UniqueName: \"kubernetes.io/projected/f1f44df5-e37a-4e11-995d-91a6d2fc538d-kube-api-access-sks6n\") pod \"redhat-marketplace-46kd4\" (UID: \"f1f44df5-e37a-4e11-995d-91a6d2fc538d\") " pod="openshift-marketplace/redhat-marketplace-46kd4" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.571582 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f44df5-e37a-4e11-995d-91a6d2fc538d-utilities\") pod \"redhat-marketplace-46kd4\" (UID: \"f1f44df5-e37a-4e11-995d-91a6d2fc538d\") " pod="openshift-marketplace/redhat-marketplace-46kd4" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.571618 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f44df5-e37a-4e11-995d-91a6d2fc538d-catalog-content\") pod \"redhat-marketplace-46kd4\" (UID: \"f1f44df5-e37a-4e11-995d-91a6d2fc538d\") " 
pod="openshift-marketplace/redhat-marketplace-46kd4" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.572081 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f44df5-e37a-4e11-995d-91a6d2fc538d-utilities\") pod \"redhat-marketplace-46kd4\" (UID: \"f1f44df5-e37a-4e11-995d-91a6d2fc538d\") " pod="openshift-marketplace/redhat-marketplace-46kd4" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.572100 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f44df5-e37a-4e11-995d-91a6d2fc538d-catalog-content\") pod \"redhat-marketplace-46kd4\" (UID: \"f1f44df5-e37a-4e11-995d-91a6d2fc538d\") " pod="openshift-marketplace/redhat-marketplace-46kd4" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.587333 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sks6n\" (UniqueName: \"kubernetes.io/projected/f1f44df5-e37a-4e11-995d-91a6d2fc538d-kube-api-access-sks6n\") pod \"redhat-marketplace-46kd4\" (UID: \"f1f44df5-e37a-4e11-995d-91a6d2fc538d\") " pod="openshift-marketplace/redhat-marketplace-46kd4" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.683851 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-46kd4" Dec 05 05:58:47 crc kubenswrapper[4865]: I1205 05:58:47.787254 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f4cmh" Dec 05 05:58:48 crc kubenswrapper[4865]: I1205 05:58:48.094052 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-46kd4"] Dec 05 05:58:48 crc kubenswrapper[4865]: I1205 05:58:48.198730 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f4cmh"] Dec 05 05:58:48 crc kubenswrapper[4865]: W1205 05:58:48.202967 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aa63a1a_0917_4c28_9307_4580800618e2.slice/crio-997c1b7f078b60685c21d0040d108316765d3a7fce086dab9b68ece16eb7ab7c WatchSource:0}: Error finding container 997c1b7f078b60685c21d0040d108316765d3a7fce086dab9b68ece16eb7ab7c: Status 404 returned error can't find the container with id 997c1b7f078b60685c21d0040d108316765d3a7fce086dab9b68ece16eb7ab7c Dec 05 05:58:48 crc kubenswrapper[4865]: I1205 05:58:48.492201 4865 generic.go:334] "Generic (PLEG): container finished" podID="f1f44df5-e37a-4e11-995d-91a6d2fc538d" containerID="83d4c48daadd39495ca815119cbd227e0b451045c2a314ae1d5a0fff777ba9e6" exitCode=0 Dec 05 05:58:48 crc kubenswrapper[4865]: I1205 05:58:48.492268 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-46kd4" event={"ID":"f1f44df5-e37a-4e11-995d-91a6d2fc538d","Type":"ContainerDied","Data":"83d4c48daadd39495ca815119cbd227e0b451045c2a314ae1d5a0fff777ba9e6"} Dec 05 05:58:48 crc kubenswrapper[4865]: I1205 05:58:48.492332 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-46kd4" event={"ID":"f1f44df5-e37a-4e11-995d-91a6d2fc538d","Type":"ContainerStarted","Data":"76cab9fd849e38e4c32b0f46f9742a609b0a2901309e19f6056004fa3258fddf"} Dec 05 05:58:48 crc kubenswrapper[4865]: I1205 05:58:48.495968 4865 generic.go:334] "Generic (PLEG): container finished" podID="4aa63a1a-0917-4c28-9307-4580800618e2" 
containerID="95faa46f7b43c65542c3fce8f1ba5a226c61043e3ec98977dd2c7a14de44409a" exitCode=0 Dec 05 05:58:48 crc kubenswrapper[4865]: I1205 05:58:48.496119 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4cmh" event={"ID":"4aa63a1a-0917-4c28-9307-4580800618e2","Type":"ContainerDied","Data":"95faa46f7b43c65542c3fce8f1ba5a226c61043e3ec98977dd2c7a14de44409a"} Dec 05 05:58:48 crc kubenswrapper[4865]: I1205 05:58:48.497610 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4cmh" event={"ID":"4aa63a1a-0917-4c28-9307-4580800618e2","Type":"ContainerStarted","Data":"997c1b7f078b60685c21d0040d108316765d3a7fce086dab9b68ece16eb7ab7c"} Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.501883 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4cmh" event={"ID":"4aa63a1a-0917-4c28-9307-4580800618e2","Type":"ContainerStarted","Data":"0a5e9bafec76849c55dc0aaadb1d580f018ad04ae4ee1b3c0742e5df4998ce01"} Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.505879 4865 generic.go:334] "Generic (PLEG): container finished" podID="f1f44df5-e37a-4e11-995d-91a6d2fc538d" containerID="77b32a268e16edff2ab079ca7fcba60ecf35f1897e2ab2723cac915cf0ef5bf6" exitCode=0 Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.505913 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-46kd4" event={"ID":"f1f44df5-e37a-4e11-995d-91a6d2fc538d","Type":"ContainerDied","Data":"77b32a268e16edff2ab079ca7fcba60ecf35f1897e2ab2723cac915cf0ef5bf6"} Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.572788 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jjwkp"] Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.573847 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jjwkp" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.576111 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jjwkp"] Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.577950 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.596795 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa31bc96-9494-4d49-b271-7059d1a6d0e0-catalog-content\") pod \"community-operators-jjwkp\" (UID: \"fa31bc96-9494-4d49-b271-7059d1a6d0e0\") " pod="openshift-marketplace/community-operators-jjwkp" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.596865 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa31bc96-9494-4d49-b271-7059d1a6d0e0-utilities\") pod \"community-operators-jjwkp\" (UID: \"fa31bc96-9494-4d49-b271-7059d1a6d0e0\") " pod="openshift-marketplace/community-operators-jjwkp" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.596892 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2d7m\" (UniqueName: \"kubernetes.io/projected/fa31bc96-9494-4d49-b271-7059d1a6d0e0-kube-api-access-x2d7m\") pod \"community-operators-jjwkp\" (UID: \"fa31bc96-9494-4d49-b271-7059d1a6d0e0\") " pod="openshift-marketplace/community-operators-jjwkp" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.698542 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa31bc96-9494-4d49-b271-7059d1a6d0e0-catalog-content\") pod \"community-operators-jjwkp\" (UID: \"fa31bc96-9494-4d49-b271-7059d1a6d0e0\") " pod="openshift-marketplace/community-operators-jjwkp" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.698602 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa31bc96-9494-4d49-b271-7059d1a6d0e0-utilities\") pod \"community-operators-jjwkp\" (UID: \"fa31bc96-9494-4d49-b271-7059d1a6d0e0\") " pod="openshift-marketplace/community-operators-jjwkp" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.698637 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2d7m\" (UniqueName: \"kubernetes.io/projected/fa31bc96-9494-4d49-b271-7059d1a6d0e0-kube-api-access-x2d7m\") pod \"community-operators-jjwkp\" (UID: \"fa31bc96-9494-4d49-b271-7059d1a6d0e0\") " pod="openshift-marketplace/community-operators-jjwkp" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.699108 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa31bc96-9494-4d49-b271-7059d1a6d0e0-catalog-content\") pod \"community-operators-jjwkp\" (UID: \"fa31bc96-9494-4d49-b271-7059d1a6d0e0\") " pod="openshift-marketplace/community-operators-jjwkp" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.699245 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa31bc96-9494-4d49-b271-7059d1a6d0e0-utilities\") pod \"community-operators-jjwkp\" (UID: 
\"fa31bc96-9494-4d49-b271-7059d1a6d0e0\") " pod="openshift-marketplace/community-operators-jjwkp" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.717072 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2d7m\" (UniqueName: \"kubernetes.io/projected/fa31bc96-9494-4d49-b271-7059d1a6d0e0-kube-api-access-x2d7m\") pod \"community-operators-jjwkp\" (UID: \"fa31bc96-9494-4d49-b271-7059d1a6d0e0\") " pod="openshift-marketplace/community-operators-jjwkp" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.756953 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vvxh5"] Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.757939 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vvxh5" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.760332 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.799881 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be94c16b-4a9a-4ad6-aafc-9879b95fdce6-utilities\") pod \"certified-operators-vvxh5\" (UID: \"be94c16b-4a9a-4ad6-aafc-9879b95fdce6\") " pod="openshift-marketplace/certified-operators-vvxh5" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.799978 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7cr7\" (UniqueName: \"kubernetes.io/projected/be94c16b-4a9a-4ad6-aafc-9879b95fdce6-kube-api-access-t7cr7\") pod \"certified-operators-vvxh5\" (UID: \"be94c16b-4a9a-4ad6-aafc-9879b95fdce6\") " pod="openshift-marketplace/certified-operators-vvxh5" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.800014 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be94c16b-4a9a-4ad6-aafc-9879b95fdce6-catalog-content\") pod \"certified-operators-vvxh5\" (UID: \"be94c16b-4a9a-4ad6-aafc-9879b95fdce6\") " pod="openshift-marketplace/certified-operators-vvxh5" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.811504 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vvxh5"] Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.894234 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jjwkp" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.900769 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be94c16b-4a9a-4ad6-aafc-9879b95fdce6-utilities\") pod \"certified-operators-vvxh5\" (UID: \"be94c16b-4a9a-4ad6-aafc-9879b95fdce6\") " pod="openshift-marketplace/certified-operators-vvxh5" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.900836 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7cr7\" (UniqueName: \"kubernetes.io/projected/be94c16b-4a9a-4ad6-aafc-9879b95fdce6-kube-api-access-t7cr7\") pod \"certified-operators-vvxh5\" (UID: \"be94c16b-4a9a-4ad6-aafc-9879b95fdce6\") " pod="openshift-marketplace/certified-operators-vvxh5" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.900868 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be94c16b-4a9a-4ad6-aafc-9879b95fdce6-catalog-content\") pod \"certified-operators-vvxh5\" (UID: \"be94c16b-4a9a-4ad6-aafc-9879b95fdce6\") " pod="openshift-marketplace/certified-operators-vvxh5" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.901200 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be94c16b-4a9a-4ad6-aafc-9879b95fdce6-utilities\") pod \"certified-operators-vvxh5\" (UID: \"be94c16b-4a9a-4ad6-aafc-9879b95fdce6\") " pod="openshift-marketplace/certified-operators-vvxh5" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.901219 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be94c16b-4a9a-4ad6-aafc-9879b95fdce6-catalog-content\") pod \"certified-operators-vvxh5\" (UID: \"be94c16b-4a9a-4ad6-aafc-9879b95fdce6\") " pod="openshift-marketplace/certified-operators-vvxh5" Dec 05 05:58:49 crc kubenswrapper[4865]: I1205 05:58:49.919699 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7cr7\" (UniqueName: \"kubernetes.io/projected/be94c16b-4a9a-4ad6-aafc-9879b95fdce6-kube-api-access-t7cr7\") pod \"certified-operators-vvxh5\" (UID: \"be94c16b-4a9a-4ad6-aafc-9879b95fdce6\") " pod="openshift-marketplace/certified-operators-vvxh5" Dec 05 05:58:50 crc kubenswrapper[4865]: I1205 05:58:50.076543 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vvxh5" Dec 05 05:58:50 crc kubenswrapper[4865]: I1205 05:58:50.361519 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jjwkp"] Dec 05 05:58:50 crc kubenswrapper[4865]: I1205 05:58:50.514306 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-46kd4" event={"ID":"f1f44df5-e37a-4e11-995d-91a6d2fc538d","Type":"ContainerStarted","Data":"0ab55f3f32ab12bbdff6ab80390601ad08167969f02738976b5056a277d489bc"} Dec 05 05:58:50 crc kubenswrapper[4865]: I1205 05:58:50.516078 4865 generic.go:334] "Generic (PLEG): container finished" podID="4aa63a1a-0917-4c28-9307-4580800618e2" containerID="0a5e9bafec76849c55dc0aaadb1d580f018ad04ae4ee1b3c0742e5df4998ce01" exitCode=0 Dec 05 05:58:50 crc kubenswrapper[4865]: I1205 05:58:50.516141 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4cmh" event={"ID":"4aa63a1a-0917-4c28-9307-4580800618e2","Type":"ContainerDied","Data":"0a5e9bafec76849c55dc0aaadb1d580f018ad04ae4ee1b3c0742e5df4998ce01"} Dec 05 05:58:50 crc kubenswrapper[4865]: I1205 05:58:50.517000 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jjwkp" event={"ID":"fa31bc96-9494-4d49-b271-7059d1a6d0e0","Type":"ContainerStarted","Data":"e0b22765396a0bbfa6e9944c351a985b781cc647ce37303ce349c03bcedbb3a2"} Dec 05 05:58:50 crc kubenswrapper[4865]: I1205 05:58:50.607665 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vvxh5"] Dec 05 05:58:50 crc kubenswrapper[4865]: W1205 05:58:50.609234 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe94c16b_4a9a_4ad6_aafc_9879b95fdce6.slice/crio-8e136e7dd2bc3ef04f85134839324f680c6a8a524f32eb1f8952ba45ae103703 WatchSource:0}: Error finding container 8e136e7dd2bc3ef04f85134839324f680c6a8a524f32eb1f8952ba45ae103703: Status 404 returned error can't find the container with id 8e136e7dd2bc3ef04f85134839324f680c6a8a524f32eb1f8952ba45ae103703 Dec 05 05:58:51 crc kubenswrapper[4865]: I1205 05:58:51.524886 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4cmh" event={"ID":"4aa63a1a-0917-4c28-9307-4580800618e2","Type":"ContainerStarted","Data":"7d3f2c24a5ac02fb3acc70146a7c8d943cfc6c640eeeea4e2bdacfc6e2683f1e"} Dec 05 05:58:51 crc kubenswrapper[4865]: I1205 05:58:51.526116 4865 generic.go:334] "Generic (PLEG): container finished" podID="fa31bc96-9494-4d49-b271-7059d1a6d0e0" containerID="da50ac0348991667499670c6247b4a7b6f97a9c8066f5ef98a4f7ee41f73d19b" exitCode=0 Dec 05 05:58:51 crc kubenswrapper[4865]: I1205 05:58:51.526183 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jjwkp" event={"ID":"fa31bc96-9494-4d49-b271-7059d1a6d0e0","Type":"ContainerDied","Data":"da50ac0348991667499670c6247b4a7b6f97a9c8066f5ef98a4f7ee41f73d19b"} Dec 05 05:58:51 crc kubenswrapper[4865]: I1205 05:58:51.527524 4865 generic.go:334] "Generic (PLEG): container finished" podID="be94c16b-4a9a-4ad6-aafc-9879b95fdce6" containerID="02b26f117fe2b1f4eabcfde7aecc255781b810ac3b73befd4dfb2eafa9028a1a" exitCode=0 Dec 05 05:58:51 crc kubenswrapper[4865]: I1205 05:58:51.527766 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vvxh5" 
event={"ID":"be94c16b-4a9a-4ad6-aafc-9879b95fdce6","Type":"ContainerDied","Data":"02b26f117fe2b1f4eabcfde7aecc255781b810ac3b73befd4dfb2eafa9028a1a"} Dec 05 05:58:51 crc kubenswrapper[4865]: I1205 05:58:51.527813 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vvxh5" event={"ID":"be94c16b-4a9a-4ad6-aafc-9879b95fdce6","Type":"ContainerStarted","Data":"8e136e7dd2bc3ef04f85134839324f680c6a8a524f32eb1f8952ba45ae103703"} Dec 05 05:58:51 crc kubenswrapper[4865]: I1205 05:58:51.544002 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f4cmh" podStartSLOduration=1.950846981 podStartE2EDuration="4.543986707s" podCreationTimestamp="2025-12-05 05:58:47 +0000 UTC" firstStartedPulling="2025-12-05 05:58:48.497656916 +0000 UTC m=+347.777668128" lastFinishedPulling="2025-12-05 05:58:51.090796632 +0000 UTC m=+350.370807854" observedRunningTime="2025-12-05 05:58:51.542665199 +0000 UTC m=+350.822676421" watchObservedRunningTime="2025-12-05 05:58:51.543986707 +0000 UTC m=+350.823997929" Dec 05 05:58:51 crc kubenswrapper[4865]: I1205 05:58:51.564409 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-46kd4" podStartSLOduration=2.858101337 podStartE2EDuration="4.564390073s" podCreationTimestamp="2025-12-05 05:58:47 +0000 UTC" firstStartedPulling="2025-12-05 05:58:48.493452397 +0000 UTC m=+347.773463619" lastFinishedPulling="2025-12-05 05:58:50.199741123 +0000 UTC m=+349.479752355" observedRunningTime="2025-12-05 05:58:51.558940809 +0000 UTC m=+350.838952041" watchObservedRunningTime="2025-12-05 05:58:51.564390073 +0000 UTC m=+350.844401295" Dec 05 05:58:52 crc kubenswrapper[4865]: I1205 05:58:52.534314 4865 generic.go:334] "Generic (PLEG): container finished" podID="be94c16b-4a9a-4ad6-aafc-9879b95fdce6" containerID="80d2a599941b83df8029b1fd38029a05c0720a53139668004918e39c3f0447b9" exitCode=0 Dec 05 05:58:52 crc kubenswrapper[4865]: I1205 05:58:52.534381 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vvxh5" event={"ID":"be94c16b-4a9a-4ad6-aafc-9879b95fdce6","Type":"ContainerDied","Data":"80d2a599941b83df8029b1fd38029a05c0720a53139668004918e39c3f0447b9"} Dec 05 05:58:52 crc kubenswrapper[4865]: I1205 05:58:52.536797 4865 generic.go:334] "Generic (PLEG): container finished" podID="fa31bc96-9494-4d49-b271-7059d1a6d0e0" containerID="77f17e8c3ed98bb2cd0b36b94ed14dc5bf361e0d7c889992a9879ae964076a48" exitCode=0 Dec 05 05:58:52 crc kubenswrapper[4865]: I1205 05:58:52.536975 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jjwkp" event={"ID":"fa31bc96-9494-4d49-b271-7059d1a6d0e0","Type":"ContainerDied","Data":"77f17e8c3ed98bb2cd0b36b94ed14dc5bf361e0d7c889992a9879ae964076a48"} Dec 05 05:58:54 crc kubenswrapper[4865]: I1205 05:58:54.548404 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jjwkp" event={"ID":"fa31bc96-9494-4d49-b271-7059d1a6d0e0","Type":"ContainerStarted","Data":"7fa59f0eac772d9d8b06b442b5992385008b53dd12c49b1b012ec0ff201711c4"} Dec 05 05:58:54 crc kubenswrapper[4865]: I1205 05:58:54.551260 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vvxh5" event={"ID":"be94c16b-4a9a-4ad6-aafc-9879b95fdce6","Type":"ContainerStarted","Data":"2314f57e108cafeb5fa0e907496f21fa2ffefa8bfa380848bc52c92566be85ae"} Dec 05 05:58:54 crc 
kubenswrapper[4865]: I1205 05:58:54.569313 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jjwkp" podStartSLOduration=4.153290041 podStartE2EDuration="5.569296004s" podCreationTimestamp="2025-12-05 05:58:49 +0000 UTC" firstStartedPulling="2025-12-05 05:58:51.527769908 +0000 UTC m=+350.807781130" lastFinishedPulling="2025-12-05 05:58:52.943775871 +0000 UTC m=+352.223787093" observedRunningTime="2025-12-05 05:58:54.568974015 +0000 UTC m=+353.848985227" watchObservedRunningTime="2025-12-05 05:58:54.569296004 +0000 UTC m=+353.849307226" Dec 05 05:58:54 crc kubenswrapper[4865]: I1205 05:58:54.588366 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vvxh5" podStartSLOduration=4.227020725 podStartE2EDuration="5.588349503s" podCreationTimestamp="2025-12-05 05:58:49 +0000 UTC" firstStartedPulling="2025-12-05 05:58:51.529041164 +0000 UTC m=+350.809052386" lastFinishedPulling="2025-12-05 05:58:52.890369942 +0000 UTC m=+352.170381164" observedRunningTime="2025-12-05 05:58:54.587460988 +0000 UTC m=+353.867472210" watchObservedRunningTime="2025-12-05 05:58:54.588349503 +0000 UTC m=+353.868360725" Dec 05 05:58:57 crc kubenswrapper[4865]: I1205 05:58:57.707417 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-46kd4" Dec 05 05:58:57 crc kubenswrapper[4865]: I1205 05:58:57.708145 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-46kd4" Dec 05 05:58:57 crc kubenswrapper[4865]: I1205 05:58:57.765377 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-46kd4" Dec 05 05:58:57 crc kubenswrapper[4865]: I1205 05:58:57.788697 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f4cmh" Dec 05 05:58:57 crc kubenswrapper[4865]: I1205 05:58:57.788795 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f4cmh" Dec 05 05:58:57 crc kubenswrapper[4865]: I1205 05:58:57.828197 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f4cmh" Dec 05 05:58:58 crc kubenswrapper[4865]: I1205 05:58:58.614298 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f4cmh" Dec 05 05:58:58 crc kubenswrapper[4865]: I1205 05:58:58.614637 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-46kd4" Dec 05 05:58:59 crc kubenswrapper[4865]: I1205 05:58:59.895278 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jjwkp" Dec 05 05:58:59 crc kubenswrapper[4865]: I1205 05:58:59.896172 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jjwkp" Dec 05 05:58:59 crc kubenswrapper[4865]: I1205 05:58:59.932812 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jjwkp" Dec 05 05:59:00 crc kubenswrapper[4865]: I1205 05:59:00.077075 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vvxh5" Dec 05 05:59:00 crc kubenswrapper[4865]: 
I1205 05:59:00.077149 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vvxh5" Dec 05 05:59:00 crc kubenswrapper[4865]: I1205 05:59:00.120008 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vvxh5" Dec 05 05:59:00 crc kubenswrapper[4865]: I1205 05:59:00.622898 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vvxh5" Dec 05 05:59:00 crc kubenswrapper[4865]: I1205 05:59:00.634475 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jjwkp" Dec 05 05:59:11 crc kubenswrapper[4865]: I1205 05:59:11.049148 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 05:59:11 crc kubenswrapper[4865]: I1205 05:59:11.050041 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 05:59:41 crc kubenswrapper[4865]: I1205 05:59:41.049686 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 05:59:41 crc kubenswrapper[4865]: I1205 05:59:41.050717 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 05:59:41 crc kubenswrapper[4865]: I1205 05:59:41.050781 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 05:59:41 crc kubenswrapper[4865]: I1205 05:59:41.051547 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c37cd466671a814dc7fd213e210192f3341c2133e9a2d5a7ced242665a144318"} pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 05:59:41 crc kubenswrapper[4865]: I1205 05:59:41.051607 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" containerID="cri-o://c37cd466671a814dc7fd213e210192f3341c2133e9a2d5a7ced242665a144318" gracePeriod=600 Dec 05 05:59:41 crc kubenswrapper[4865]: I1205 05:59:41.920532 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="c37cd466671a814dc7fd213e210192f3341c2133e9a2d5a7ced242665a144318" exitCode=0 Dec 05 05:59:41 crc kubenswrapper[4865]: I1205 
05:59:41.920624 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"c37cd466671a814dc7fd213e210192f3341c2133e9a2d5a7ced242665a144318"} Dec 05 05:59:41 crc kubenswrapper[4865]: I1205 05:59:41.921484 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"0fa076b7876af986bac2b8667cbf6d275c93b22d832f82ad2c83ef7e91ad5c2a"} Dec 05 05:59:41 crc kubenswrapper[4865]: I1205 05:59:41.921515 4865 scope.go:117] "RemoveContainer" containerID="1da390b15af25b9223a372681201798d719c48662ab76913d773a35198260faf" Dec 05 06:00:00 crc kubenswrapper[4865]: I1205 06:00:00.176447 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4"] Dec 05 06:00:00 crc kubenswrapper[4865]: I1205 06:00:00.178386 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4" Dec 05 06:00:00 crc kubenswrapper[4865]: I1205 06:00:00.180498 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 06:00:00 crc kubenswrapper[4865]: I1205 06:00:00.181694 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 06:00:00 crc kubenswrapper[4865]: I1205 06:00:00.192064 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4"] Dec 05 06:00:00 crc kubenswrapper[4865]: I1205 06:00:00.320493 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c1384ef4-f086-4d99-92af-ed79b1e25ac8-secret-volume\") pod \"collect-profiles-29415240-ds7p4\" (UID: \"c1384ef4-f086-4d99-92af-ed79b1e25ac8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4" Dec 05 06:00:00 crc kubenswrapper[4865]: I1205 06:00:00.320566 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c1384ef4-f086-4d99-92af-ed79b1e25ac8-config-volume\") pod \"collect-profiles-29415240-ds7p4\" (UID: \"c1384ef4-f086-4d99-92af-ed79b1e25ac8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4" Dec 05 06:00:00 crc kubenswrapper[4865]: I1205 06:00:00.320598 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-978cl\" (UniqueName: \"kubernetes.io/projected/c1384ef4-f086-4d99-92af-ed79b1e25ac8-kube-api-access-978cl\") pod \"collect-profiles-29415240-ds7p4\" (UID: \"c1384ef4-f086-4d99-92af-ed79b1e25ac8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4" Dec 05 06:00:00 crc kubenswrapper[4865]: I1205 06:00:00.422380 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c1384ef4-f086-4d99-92af-ed79b1e25ac8-secret-volume\") pod \"collect-profiles-29415240-ds7p4\" (UID: \"c1384ef4-f086-4d99-92af-ed79b1e25ac8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4" 
Dec 05 06:00:00 crc kubenswrapper[4865]: I1205 06:00:00.422739 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c1384ef4-f086-4d99-92af-ed79b1e25ac8-config-volume\") pod \"collect-profiles-29415240-ds7p4\" (UID: \"c1384ef4-f086-4d99-92af-ed79b1e25ac8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4" Dec 05 06:00:00 crc kubenswrapper[4865]: I1205 06:00:00.422939 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-978cl\" (UniqueName: \"kubernetes.io/projected/c1384ef4-f086-4d99-92af-ed79b1e25ac8-kube-api-access-978cl\") pod \"collect-profiles-29415240-ds7p4\" (UID: \"c1384ef4-f086-4d99-92af-ed79b1e25ac8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4" Dec 05 06:00:00 crc kubenswrapper[4865]: I1205 06:00:00.423738 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c1384ef4-f086-4d99-92af-ed79b1e25ac8-config-volume\") pod \"collect-profiles-29415240-ds7p4\" (UID: \"c1384ef4-f086-4d99-92af-ed79b1e25ac8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4" Dec 05 06:00:00 crc kubenswrapper[4865]: I1205 06:00:00.439999 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c1384ef4-f086-4d99-92af-ed79b1e25ac8-secret-volume\") pod \"collect-profiles-29415240-ds7p4\" (UID: \"c1384ef4-f086-4d99-92af-ed79b1e25ac8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4" Dec 05 06:00:00 crc kubenswrapper[4865]: I1205 06:00:00.445776 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-978cl\" (UniqueName: \"kubernetes.io/projected/c1384ef4-f086-4d99-92af-ed79b1e25ac8-kube-api-access-978cl\") pod \"collect-profiles-29415240-ds7p4\" (UID: \"c1384ef4-f086-4d99-92af-ed79b1e25ac8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4" Dec 05 06:00:00 crc kubenswrapper[4865]: I1205 06:00:00.497608 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4" Dec 05 06:00:00 crc kubenswrapper[4865]: I1205 06:00:00.724612 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4"] Dec 05 06:00:00 crc kubenswrapper[4865]: W1205 06:00:00.736310 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1384ef4_f086_4d99_92af_ed79b1e25ac8.slice/crio-2a89d2e6f2cea79cff2aa0fe1ea21e73d549ff87f01a684d9007abf2acd87792 WatchSource:0}: Error finding container 2a89d2e6f2cea79cff2aa0fe1ea21e73d549ff87f01a684d9007abf2acd87792: Status 404 returned error can't find the container with id 2a89d2e6f2cea79cff2aa0fe1ea21e73d549ff87f01a684d9007abf2acd87792 Dec 05 06:00:01 crc kubenswrapper[4865]: I1205 06:00:01.057816 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1384ef4-f086-4d99-92af-ed79b1e25ac8" containerID="b06dd3f9ba6d11b579453c441f4b36af739a6c6294319197f5113532322064df" exitCode=0 Dec 05 06:00:01 crc kubenswrapper[4865]: I1205 06:00:01.057869 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4" event={"ID":"c1384ef4-f086-4d99-92af-ed79b1e25ac8","Type":"ContainerDied","Data":"b06dd3f9ba6d11b579453c441f4b36af739a6c6294319197f5113532322064df"} Dec 05 06:00:01 crc kubenswrapper[4865]: I1205 06:00:01.057908 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4" event={"ID":"c1384ef4-f086-4d99-92af-ed79b1e25ac8","Type":"ContainerStarted","Data":"2a89d2e6f2cea79cff2aa0fe1ea21e73d549ff87f01a684d9007abf2acd87792"} Dec 05 06:00:01 crc kubenswrapper[4865]: I1205 06:00:01.577502 4865 scope.go:117] "RemoveContainer" containerID="9a3faeca9dc31aa6a02c4a07a9d9ae868dc05e51758895ac25b80a3ee2b6d5ca" Dec 05 06:00:02 crc kubenswrapper[4865]: I1205 06:00:02.280248 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4" Dec 05 06:00:02 crc kubenswrapper[4865]: I1205 06:00:02.447341 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-978cl\" (UniqueName: \"kubernetes.io/projected/c1384ef4-f086-4d99-92af-ed79b1e25ac8-kube-api-access-978cl\") pod \"c1384ef4-f086-4d99-92af-ed79b1e25ac8\" (UID: \"c1384ef4-f086-4d99-92af-ed79b1e25ac8\") " Dec 05 06:00:02 crc kubenswrapper[4865]: I1205 06:00:02.447421 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c1384ef4-f086-4d99-92af-ed79b1e25ac8-secret-volume\") pod \"c1384ef4-f086-4d99-92af-ed79b1e25ac8\" (UID: \"c1384ef4-f086-4d99-92af-ed79b1e25ac8\") " Dec 05 06:00:02 crc kubenswrapper[4865]: I1205 06:00:02.447453 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c1384ef4-f086-4d99-92af-ed79b1e25ac8-config-volume\") pod \"c1384ef4-f086-4d99-92af-ed79b1e25ac8\" (UID: \"c1384ef4-f086-4d99-92af-ed79b1e25ac8\") " Dec 05 06:00:02 crc kubenswrapper[4865]: I1205 06:00:02.448160 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1384ef4-f086-4d99-92af-ed79b1e25ac8-config-volume" (OuterVolumeSpecName: "config-volume") pod "c1384ef4-f086-4d99-92af-ed79b1e25ac8" (UID: "c1384ef4-f086-4d99-92af-ed79b1e25ac8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:00:02 crc kubenswrapper[4865]: I1205 06:00:02.452336 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1384ef4-f086-4d99-92af-ed79b1e25ac8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c1384ef4-f086-4d99-92af-ed79b1e25ac8" (UID: "c1384ef4-f086-4d99-92af-ed79b1e25ac8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:00:02 crc kubenswrapper[4865]: I1205 06:00:02.453425 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1384ef4-f086-4d99-92af-ed79b1e25ac8-kube-api-access-978cl" (OuterVolumeSpecName: "kube-api-access-978cl") pod "c1384ef4-f086-4d99-92af-ed79b1e25ac8" (UID: "c1384ef4-f086-4d99-92af-ed79b1e25ac8"). InnerVolumeSpecName "kube-api-access-978cl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:00:02 crc kubenswrapper[4865]: I1205 06:00:02.548573 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-978cl\" (UniqueName: \"kubernetes.io/projected/c1384ef4-f086-4d99-92af-ed79b1e25ac8-kube-api-access-978cl\") on node \"crc\" DevicePath \"\"" Dec 05 06:00:02 crc kubenswrapper[4865]: I1205 06:00:02.548602 4865 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c1384ef4-f086-4d99-92af-ed79b1e25ac8-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 06:00:02 crc kubenswrapper[4865]: I1205 06:00:02.548612 4865 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c1384ef4-f086-4d99-92af-ed79b1e25ac8-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 06:00:03 crc kubenswrapper[4865]: I1205 06:00:03.073370 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4" event={"ID":"c1384ef4-f086-4d99-92af-ed79b1e25ac8","Type":"ContainerDied","Data":"2a89d2e6f2cea79cff2aa0fe1ea21e73d549ff87f01a684d9007abf2acd87792"} Dec 05 06:00:03 crc kubenswrapper[4865]: I1205 06:00:03.074911 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a89d2e6f2cea79cff2aa0fe1ea21e73d549ff87f01a684d9007abf2acd87792" Dec 05 06:00:03 crc kubenswrapper[4865]: I1205 06:00:03.073420 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4" Dec 05 06:01:41 crc kubenswrapper[4865]: I1205 06:01:41.049614 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:01:41 crc kubenswrapper[4865]: I1205 06:01:41.050657 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:02:11 crc kubenswrapper[4865]: I1205 06:02:11.049159 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:02:11 crc kubenswrapper[4865]: I1205 06:02:11.049863 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:02:41 crc kubenswrapper[4865]: I1205 06:02:41.048841 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:02:41 crc kubenswrapper[4865]: I1205 
06:02:41.049561 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:02:41 crc kubenswrapper[4865]: I1205 06:02:41.049620 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 06:02:41 crc kubenswrapper[4865]: I1205 06:02:41.050362 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0fa076b7876af986bac2b8667cbf6d275c93b22d832f82ad2c83ef7e91ad5c2a"} pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 06:02:41 crc kubenswrapper[4865]: I1205 06:02:41.050430 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" containerID="cri-o://0fa076b7876af986bac2b8667cbf6d275c93b22d832f82ad2c83ef7e91ad5c2a" gracePeriod=600 Dec 05 06:02:41 crc kubenswrapper[4865]: I1205 06:02:41.605663 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="0fa076b7876af986bac2b8667cbf6d275c93b22d832f82ad2c83ef7e91ad5c2a" exitCode=0 Dec 05 06:02:41 crc kubenswrapper[4865]: I1205 06:02:41.605733 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"0fa076b7876af986bac2b8667cbf6d275c93b22d832f82ad2c83ef7e91ad5c2a"} Dec 05 06:02:41 crc kubenswrapper[4865]: I1205 06:02:41.606102 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"80e4ab6bf8f2776e8ac270a6781a82ac7a7696de67acef027bfa81b854301141"} Dec 05 06:02:41 crc kubenswrapper[4865]: I1205 06:02:41.606129 4865 scope.go:117] "RemoveContainer" containerID="c37cd466671a814dc7fd213e210192f3341c2133e9a2d5a7ced242665a144318" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.456149 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-hs4kn"] Dec 05 06:03:59 crc kubenswrapper[4865]: E1205 06:03:59.457001 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1384ef4-f086-4d99-92af-ed79b1e25ac8" containerName="collect-profiles" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.457015 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1384ef4-f086-4d99-92af-ed79b1e25ac8" containerName="collect-profiles" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.457132 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1384ef4-f086-4d99-92af-ed79b1e25ac8" containerName="collect-profiles" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.457602 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-hs4kn" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.460393 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.460765 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.474894 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-hs4kn"] Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.479288 4865 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-xdqsq" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.520064 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-8fhmd"] Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.521048 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-8fhmd" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.523298 4865 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-ttlvn" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.535854 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-nkfts"] Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.536787 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-nkfts" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.539098 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-8fhmd"] Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.541033 4865 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-ncm7j" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.560865 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r94vn\" (UniqueName: \"kubernetes.io/projected/b2f0ff72-fd51-40eb-94c9-7dea7e5c48ae-kube-api-access-r94vn\") pod \"cert-manager-cainjector-7f985d654d-hs4kn\" (UID: \"b2f0ff72-fd51-40eb-94c9-7dea7e5c48ae\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-hs4kn" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.561214 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvrbx\" (UniqueName: \"kubernetes.io/projected/1ee0a305-a19d-4053-995b-e30a57c8cc07-kube-api-access-vvrbx\") pod \"cert-manager-5b446d88c5-8fhmd\" (UID: \"1ee0a305-a19d-4053-995b-e30a57c8cc07\") " pod="cert-manager/cert-manager-5b446d88c5-8fhmd" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.561601 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl2p4\" (UniqueName: \"kubernetes.io/projected/5d3a98df-9953-49ab-a722-f37837073178-kube-api-access-kl2p4\") pod \"cert-manager-webhook-5655c58dd6-nkfts\" (UID: \"5d3a98df-9953-49ab-a722-f37837073178\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-nkfts" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.578248 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-nkfts"] Dec 05 06:03:59 
crc kubenswrapper[4865]: I1205 06:03:59.666918 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvrbx\" (UniqueName: \"kubernetes.io/projected/1ee0a305-a19d-4053-995b-e30a57c8cc07-kube-api-access-vvrbx\") pod \"cert-manager-5b446d88c5-8fhmd\" (UID: \"1ee0a305-a19d-4053-995b-e30a57c8cc07\") " pod="cert-manager/cert-manager-5b446d88c5-8fhmd" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.667018 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl2p4\" (UniqueName: \"kubernetes.io/projected/5d3a98df-9953-49ab-a722-f37837073178-kube-api-access-kl2p4\") pod \"cert-manager-webhook-5655c58dd6-nkfts\" (UID: \"5d3a98df-9953-49ab-a722-f37837073178\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-nkfts" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.667077 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r94vn\" (UniqueName: \"kubernetes.io/projected/b2f0ff72-fd51-40eb-94c9-7dea7e5c48ae-kube-api-access-r94vn\") pod \"cert-manager-cainjector-7f985d654d-hs4kn\" (UID: \"b2f0ff72-fd51-40eb-94c9-7dea7e5c48ae\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-hs4kn" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.690324 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl2p4\" (UniqueName: \"kubernetes.io/projected/5d3a98df-9953-49ab-a722-f37837073178-kube-api-access-kl2p4\") pod \"cert-manager-webhook-5655c58dd6-nkfts\" (UID: \"5d3a98df-9953-49ab-a722-f37837073178\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-nkfts" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.699142 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvrbx\" (UniqueName: \"kubernetes.io/projected/1ee0a305-a19d-4053-995b-e30a57c8cc07-kube-api-access-vvrbx\") pod \"cert-manager-5b446d88c5-8fhmd\" (UID: \"1ee0a305-a19d-4053-995b-e30a57c8cc07\") " pod="cert-manager/cert-manager-5b446d88c5-8fhmd" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.703040 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r94vn\" (UniqueName: \"kubernetes.io/projected/b2f0ff72-fd51-40eb-94c9-7dea7e5c48ae-kube-api-access-r94vn\") pod \"cert-manager-cainjector-7f985d654d-hs4kn\" (UID: \"b2f0ff72-fd51-40eb-94c9-7dea7e5c48ae\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-hs4kn" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.772678 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-hs4kn" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.836768 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-8fhmd" Dec 05 06:03:59 crc kubenswrapper[4865]: I1205 06:03:59.859996 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-nkfts" Dec 05 06:04:00 crc kubenswrapper[4865]: I1205 06:04:00.065491 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-hs4kn"] Dec 05 06:04:00 crc kubenswrapper[4865]: I1205 06:04:00.075402 4865 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 06:04:00 crc kubenswrapper[4865]: I1205 06:04:00.162029 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-8fhmd"] Dec 05 06:04:00 crc kubenswrapper[4865]: W1205 06:04:00.168070 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ee0a305_a19d_4053_995b_e30a57c8cc07.slice/crio-7b72b18ddedcef0f5547bf3cd78ab8077e9b9b2a8c89fbacbf4464ee8c7299df WatchSource:0}: Error finding container 7b72b18ddedcef0f5547bf3cd78ab8077e9b9b2a8c89fbacbf4464ee8c7299df: Status 404 returned error can't find the container with id 7b72b18ddedcef0f5547bf3cd78ab8077e9b9b2a8c89fbacbf4464ee8c7299df Dec 05 06:04:00 crc kubenswrapper[4865]: I1205 06:04:00.220966 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-hs4kn" event={"ID":"b2f0ff72-fd51-40eb-94c9-7dea7e5c48ae","Type":"ContainerStarted","Data":"9100d9d776d05f42caa12f41c97553b306cee0396dae7cde618d6adade26180c"} Dec 05 06:04:00 crc kubenswrapper[4865]: I1205 06:04:00.225896 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-8fhmd" event={"ID":"1ee0a305-a19d-4053-995b-e30a57c8cc07","Type":"ContainerStarted","Data":"7b72b18ddedcef0f5547bf3cd78ab8077e9b9b2a8c89fbacbf4464ee8c7299df"} Dec 05 06:04:00 crc kubenswrapper[4865]: I1205 06:04:00.234200 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-nkfts"] Dec 05 06:04:00 crc kubenswrapper[4865]: W1205 06:04:00.241307 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d3a98df_9953_49ab_a722_f37837073178.slice/crio-06d106eb3355174ce92174485128e6cdd750cf418804ac1c4bf15fa260e82e95 WatchSource:0}: Error finding container 06d106eb3355174ce92174485128e6cdd750cf418804ac1c4bf15fa260e82e95: Status 404 returned error can't find the container with id 06d106eb3355174ce92174485128e6cdd750cf418804ac1c4bf15fa260e82e95 Dec 05 06:04:01 crc kubenswrapper[4865]: I1205 06:04:01.233765 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-nkfts" event={"ID":"5d3a98df-9953-49ab-a722-f37837073178","Type":"ContainerStarted","Data":"06d106eb3355174ce92174485128e6cdd750cf418804ac1c4bf15fa260e82e95"} Dec 05 06:04:04 crc kubenswrapper[4865]: I1205 06:04:04.256102 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-8fhmd" event={"ID":"1ee0a305-a19d-4053-995b-e30a57c8cc07","Type":"ContainerStarted","Data":"3f971ca6b06f8a8dd53fd09fbfd4c20cfbdc094f2b7c2d1bcbbe004125d9ffd4"} Dec 05 06:04:04 crc kubenswrapper[4865]: I1205 06:04:04.259966 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-hs4kn" event={"ID":"b2f0ff72-fd51-40eb-94c9-7dea7e5c48ae","Type":"ContainerStarted","Data":"020c44546dd1cfd93a6a1653b48c6b85bb87386d380dc9ff79f45a8dc27d7cb3"} Dec 05 06:04:04 crc kubenswrapper[4865]: I1205 06:04:04.262278 4865 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-nkfts" event={"ID":"5d3a98df-9953-49ab-a722-f37837073178","Type":"ContainerStarted","Data":"d5332376674f871718d951d242c1310d86d06d5457ecf18987f42b22ccbbfed5"} Dec 05 06:04:04 crc kubenswrapper[4865]: I1205 06:04:04.262517 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-nkfts" Dec 05 06:04:04 crc kubenswrapper[4865]: I1205 06:04:04.285436 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-8fhmd" podStartSLOduration=1.844627489 podStartE2EDuration="5.285415179s" podCreationTimestamp="2025-12-05 06:03:59 +0000 UTC" firstStartedPulling="2025-12-05 06:04:00.171938403 +0000 UTC m=+659.451949625" lastFinishedPulling="2025-12-05 06:04:03.612726053 +0000 UTC m=+662.892737315" observedRunningTime="2025-12-05 06:04:04.281976239 +0000 UTC m=+663.561987501" watchObservedRunningTime="2025-12-05 06:04:04.285415179 +0000 UTC m=+663.565426401" Dec 05 06:04:04 crc kubenswrapper[4865]: I1205 06:04:04.323439 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-nkfts" podStartSLOduration=2.01685345 podStartE2EDuration="5.323415457s" podCreationTimestamp="2025-12-05 06:03:59 +0000 UTC" firstStartedPulling="2025-12-05 06:04:00.24383426 +0000 UTC m=+659.523845482" lastFinishedPulling="2025-12-05 06:04:03.550396267 +0000 UTC m=+662.830407489" observedRunningTime="2025-12-05 06:04:04.319532055 +0000 UTC m=+663.599543287" watchObservedRunningTime="2025-12-05 06:04:04.323415457 +0000 UTC m=+663.603426669" Dec 05 06:04:04 crc kubenswrapper[4865]: I1205 06:04:04.348238 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-hs4kn" podStartSLOduration=1.874197155 podStartE2EDuration="5.348208827s" podCreationTimestamp="2025-12-05 06:03:59 +0000 UTC" firstStartedPulling="2025-12-05 06:04:00.075207474 +0000 UTC m=+659.355218696" lastFinishedPulling="2025-12-05 06:04:03.549219146 +0000 UTC m=+662.829230368" observedRunningTime="2025-12-05 06:04:04.341216614 +0000 UTC m=+663.621227876" watchObservedRunningTime="2025-12-05 06:04:04.348208827 +0000 UTC m=+663.628220049" Dec 05 06:04:04 crc kubenswrapper[4865]: I1205 06:04:04.809287 4865 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 06:04:09 crc kubenswrapper[4865]: I1205 06:04:09.869589 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-nkfts" Dec 05 06:04:28 crc kubenswrapper[4865]: I1205 06:04:28.851926 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g5k4k"] Dec 05 06:04:28 crc kubenswrapper[4865]: I1205 06:04:28.853092 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="ovn-controller" containerID="cri-o://6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75" gracePeriod=30 Dec 05 06:04:28 crc kubenswrapper[4865]: I1205 06:04:28.853208 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="kube-rbac-proxy-node" 
containerID="cri-o://67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270" gracePeriod=30 Dec 05 06:04:28 crc kubenswrapper[4865]: I1205 06:04:28.853224 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="nbdb" containerID="cri-o://6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85" gracePeriod=30 Dec 05 06:04:28 crc kubenswrapper[4865]: I1205 06:04:28.853252 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="ovn-acl-logging" containerID="cri-o://2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee" gracePeriod=30 Dec 05 06:04:28 crc kubenswrapper[4865]: I1205 06:04:28.853327 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="northd" containerID="cri-o://b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90" gracePeriod=30 Dec 05 06:04:28 crc kubenswrapper[4865]: I1205 06:04:28.853474 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="sbdb" containerID="cri-o://df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f" gracePeriod=30 Dec 05 06:04:28 crc kubenswrapper[4865]: I1205 06:04:28.854201 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7" gracePeriod=30 Dec 05 06:04:28 crc kubenswrapper[4865]: I1205 06:04:28.930671 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="ovnkube-controller" containerID="cri-o://6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661" gracePeriod=30 Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.207859 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5k4k_e740ad4f-4c03-467b-8f0f-4fec2493d426/ovn-acl-logging/0.log" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.210419 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5k4k_e740ad4f-4c03-467b-8f0f-4fec2493d426/ovn-controller/0.log" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.211919 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.290034 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mzl8h"] Dec 05 06:04:30 crc kubenswrapper[4865]: E1205 06:04:30.290381 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="kubecfg-setup" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.290412 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="kubecfg-setup" Dec 05 06:04:30 crc kubenswrapper[4865]: E1205 06:04:30.290433 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.290446 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 06:04:30 crc kubenswrapper[4865]: E1205 06:04:30.290470 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="ovn-controller" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.290484 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="ovn-controller" Dec 05 06:04:30 crc kubenswrapper[4865]: E1205 06:04:30.290513 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="ovnkube-controller" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.290526 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="ovnkube-controller" Dec 05 06:04:30 crc kubenswrapper[4865]: E1205 06:04:30.290573 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="sbdb" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.290590 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="sbdb" Dec 05 06:04:30 crc kubenswrapper[4865]: E1205 06:04:30.290617 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="northd" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.290634 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="northd" Dec 05 06:04:30 crc kubenswrapper[4865]: E1205 06:04:30.290662 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="ovn-acl-logging" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.290679 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="ovn-acl-logging" Dec 05 06:04:30 crc kubenswrapper[4865]: E1205 06:04:30.290707 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="nbdb" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.290724 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="nbdb" Dec 05 06:04:30 crc kubenswrapper[4865]: E1205 06:04:30.290741 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" 
containerName="kube-rbac-proxy-node" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.290758 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="kube-rbac-proxy-node" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.291413 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.291468 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="northd" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.291497 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="ovn-controller" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.291525 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="ovn-acl-logging" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.291547 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="sbdb" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.291570 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="kube-rbac-proxy-node" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.291595 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="ovnkube-controller" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.291618 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerName="nbdb" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.295212 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346027 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e740ad4f-4c03-467b-8f0f-4fec2493d426-ovnkube-script-lib\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346097 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx4n2\" (UniqueName: \"kubernetes.io/projected/e740ad4f-4c03-467b-8f0f-4fec2493d426-kube-api-access-zx4n2\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346140 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-run-systemd\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346171 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-kubelet\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346207 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-systemd-units\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346245 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346278 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-run-ovn-kubernetes\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346302 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-run-openvswitch\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346275 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346337 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-cni-netd\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346358 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-run-netns\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346382 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-cni-bin\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346385 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346416 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346425 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e740ad4f-4c03-467b-8f0f-4fec2493d426-ovnkube-config\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346441 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346460 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e740ad4f-4c03-467b-8f0f-4fec2493d426-ovn-node-metrics-cert\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346471 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346482 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-node-log\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346494 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346515 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346555 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-log-socket\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346581 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-slash\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346604 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e740ad4f-4c03-467b-8f0f-4fec2493d426-env-overrides\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346625 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-run-ovn\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346651 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-etc-openvswitch\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346674 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-var-lib-openvswitch\") pod \"e740ad4f-4c03-467b-8f0f-4fec2493d426\" (UID: \"e740ad4f-4c03-467b-8f0f-4fec2493d426\") " Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.346984 4865 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.347017 4865 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.347033 4865 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.347044 4865 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.347058 4865 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 
06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.347070 4865 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.347087 4865 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.347069 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e740ad4f-4c03-467b-8f0f-4fec2493d426-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.347099 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.347351 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.347392 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-node-log" (OuterVolumeSpecName: "node-log") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.347369 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e740ad4f-4c03-467b-8f0f-4fec2493d426-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.347422 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.347441 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-log-socket" (OuterVolumeSpecName: "log-socket") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.347455 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-slash" (OuterVolumeSpecName: "host-slash") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.347472 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.347753 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e740ad4f-4c03-467b-8f0f-4fec2493d426-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.364522 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e740ad4f-4c03-467b-8f0f-4fec2493d426-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.364632 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e740ad4f-4c03-467b-8f0f-4fec2493d426-kube-api-access-zx4n2" (OuterVolumeSpecName: "kube-api-access-zx4n2") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "kube-api-access-zx4n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.373397 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e740ad4f-4c03-467b-8f0f-4fec2493d426" (UID: "e740ad4f-4c03-467b-8f0f-4fec2493d426"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.439349 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5k4k_e740ad4f-4c03-467b-8f0f-4fec2493d426/ovn-acl-logging/0.log" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.439858 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5k4k_e740ad4f-4c03-467b-8f0f-4fec2493d426/ovn-controller/0.log" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.440398 4865 generic.go:334] "Generic (PLEG): container finished" podID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerID="6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661" exitCode=0 Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.440444 4865 generic.go:334] "Generic (PLEG): container finished" podID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerID="df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f" exitCode=0 Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.440459 4865 generic.go:334] "Generic (PLEG): container finished" podID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerID="6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85" exitCode=0 Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.440468 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.440491 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" event={"ID":"e740ad4f-4c03-467b-8f0f-4fec2493d426","Type":"ContainerDied","Data":"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.440537 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" event={"ID":"e740ad4f-4c03-467b-8f0f-4fec2493d426","Type":"ContainerDied","Data":"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.440554 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" event={"ID":"e740ad4f-4c03-467b-8f0f-4fec2493d426","Type":"ContainerDied","Data":"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.440569 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" event={"ID":"e740ad4f-4c03-467b-8f0f-4fec2493d426","Type":"ContainerDied","Data":"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.440593 4865 scope.go:117] "RemoveContainer" containerID="6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.440472 4865 generic.go:334] "Generic (PLEG): container finished" podID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerID="b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90" exitCode=0 Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.440999 4865 generic.go:334] "Generic (PLEG): container finished" podID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerID="985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7" exitCode=0 Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441015 4865 generic.go:334] "Generic (PLEG): container 
finished" podID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerID="67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270" exitCode=0 Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441032 4865 generic.go:334] "Generic (PLEG): container finished" podID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerID="2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee" exitCode=143 Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441044 4865 generic.go:334] "Generic (PLEG): container finished" podID="e740ad4f-4c03-467b-8f0f-4fec2493d426" containerID="6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75" exitCode=143 Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441107 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" event={"ID":"e740ad4f-4c03-467b-8f0f-4fec2493d426","Type":"ContainerDied","Data":"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441128 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" event={"ID":"e740ad4f-4c03-467b-8f0f-4fec2493d426","Type":"ContainerDied","Data":"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441143 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441157 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441165 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441175 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" event={"ID":"e740ad4f-4c03-467b-8f0f-4fec2493d426","Type":"ContainerDied","Data":"2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441185 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441195 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441202 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441210 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441217 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441225 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441233 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441241 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441248 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441258 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" event={"ID":"e740ad4f-4c03-467b-8f0f-4fec2493d426","Type":"ContainerDied","Data":"6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441269 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441278 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441285 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441292 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441300 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441307 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441314 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441321 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441328 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441337 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5k4k" event={"ID":"e740ad4f-4c03-467b-8f0f-4fec2493d426","Type":"ContainerDied","Data":"25d34272017cc6e15cc9923323dccd32c76235fdd2d1a02011dea3e46571dcfa"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441348 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441357 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441364 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441371 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441379 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441386 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441393 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441399 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.441406 4865 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.447942 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-slash\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.447989 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bb88223f-cdf6-4222-af60-fd62638c4efa-ovnkube-config\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.448025 4865 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-kubelet\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.448048 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-node-log\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.448080 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-run-ovn-kubernetes\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.448162 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bb88223f-cdf6-4222-af60-fd62638c4efa-ovnkube-script-lib\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.448378 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-log-socket\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.448476 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bb88223f-cdf6-4222-af60-fd62638c4efa-env-overrides\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.448548 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-cni-netd\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.448638 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-cni-bin\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.448762 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bb88223f-cdf6-4222-af60-fd62638c4efa-ovn-node-metrics-cert\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 
crc kubenswrapper[4865]: I1205 06:04:30.449537 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-systemd-units\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.449759 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-run-ovn\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.449944 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfg8b\" (UniqueName: \"kubernetes.io/projected/bb88223f-cdf6-4222-af60-fd62638c4efa-kube-api-access-xfg8b\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.450014 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-run-netns\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.450129 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-run-systemd\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.450982 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-var-lib-openvswitch\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.451100 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-etc-openvswitch\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.451223 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-run-openvswitch\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.451296 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.451424 4865 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.451451 4865 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e740ad4f-4c03-467b-8f0f-4fec2493d426-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.451467 4865 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e740ad4f-4c03-467b-8f0f-4fec2493d426-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.451509 4865 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-node-log\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.451527 4865 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-log-socket\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.451541 4865 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-host-slash\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.451558 4865 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e740ad4f-4c03-467b-8f0f-4fec2493d426-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.451598 4865 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.451617 4865 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.451631 4865 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.451657 4865 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e740ad4f-4c03-467b-8f0f-4fec2493d426-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.451672 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx4n2\" (UniqueName: \"kubernetes.io/projected/e740ad4f-4c03-467b-8f0f-4fec2493d426-kube-api-access-zx4n2\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.451687 4865 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/e740ad4f-4c03-467b-8f0f-4fec2493d426-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.452717 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bpkm9_2d1a82bf-1dc7-48e4-b2e2-32514537aae7/kube-multus/0.log" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.452808 4865 generic.go:334] "Generic (PLEG): container finished" podID="2d1a82bf-1dc7-48e4-b2e2-32514537aae7" containerID="26b35f7cbeb5d671a1296bf1f0c1cd2685ea23b78ab0270bffa3b7e547e85b98" exitCode=2 Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.452965 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bpkm9" event={"ID":"2d1a82bf-1dc7-48e4-b2e2-32514537aae7","Type":"ContainerDied","Data":"26b35f7cbeb5d671a1296bf1f0c1cd2685ea23b78ab0270bffa3b7e547e85b98"} Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.453685 4865 scope.go:117] "RemoveContainer" containerID="26b35f7cbeb5d671a1296bf1f0c1cd2685ea23b78ab0270bffa3b7e547e85b98" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.472465 4865 scope.go:117] "RemoveContainer" containerID="df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.500751 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g5k4k"] Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.505788 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g5k4k"] Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.512080 4865 scope.go:117] "RemoveContainer" containerID="6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.539353 4865 scope.go:117] "RemoveContainer" containerID="b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.553467 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-etc-openvswitch\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.553532 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-run-openvswitch\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.553598 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-etc-openvswitch\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.553774 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-run-openvswitch\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.553888 4865 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.553970 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.554109 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-slash\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.554188 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-slash\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.554294 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bb88223f-cdf6-4222-af60-fd62638c4efa-ovnkube-config\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.555166 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bb88223f-cdf6-4222-af60-fd62638c4efa-ovnkube-config\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.555181 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-kubelet\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.555256 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-kubelet\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.555295 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-node-log\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.555325 4865 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-run-ovn-kubernetes\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.555370 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-node-log\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.555377 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bb88223f-cdf6-4222-af60-fd62638c4efa-ovnkube-script-lib\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.556233 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-log-socket\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.556357 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bb88223f-cdf6-4222-af60-fd62638c4efa-env-overrides\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.556164 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bb88223f-cdf6-4222-af60-fd62638c4efa-ovnkube-script-lib\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.556304 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-log-socket\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.556944 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bb88223f-cdf6-4222-af60-fd62638c4efa-env-overrides\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.556999 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-cni-netd\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.555449 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-run-ovn-kubernetes\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.557084 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-cni-bin\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.557165 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-cni-netd\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.557276 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-cni-bin\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.557334 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bb88223f-cdf6-4222-af60-fd62638c4efa-ovn-node-metrics-cert\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.559425 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-systemd-units\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.559503 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-systemd-units\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.559590 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-run-ovn\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.559627 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfg8b\" (UniqueName: \"kubernetes.io/projected/bb88223f-cdf6-4222-af60-fd62638c4efa-kube-api-access-xfg8b\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.559659 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-run-netns\") pod \"ovnkube-node-mzl8h\" (UID: 
\"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.559688 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-run-systemd\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.559719 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-var-lib-openvswitch\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.559789 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-var-lib-openvswitch\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.559845 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-run-ovn\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.560109 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-run-systemd\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.560931 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bb88223f-cdf6-4222-af60-fd62638c4efa-host-run-netns\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.562884 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bb88223f-cdf6-4222-af60-fd62638c4efa-ovn-node-metrics-cert\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.573396 4865 scope.go:117] "RemoveContainer" containerID="985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.589735 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfg8b\" (UniqueName: \"kubernetes.io/projected/bb88223f-cdf6-4222-af60-fd62638c4efa-kube-api-access-xfg8b\") pod \"ovnkube-node-mzl8h\" (UID: \"bb88223f-cdf6-4222-af60-fd62638c4efa\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.595504 4865 scope.go:117] "RemoveContainer" containerID="67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 
06:04:30.610978 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.614108 4865 scope.go:117] "RemoveContainer" containerID="2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.644943 4865 scope.go:117] "RemoveContainer" containerID="6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.670271 4865 scope.go:117] "RemoveContainer" containerID="26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.689526 4865 scope.go:117] "RemoveContainer" containerID="6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661" Dec 05 06:04:30 crc kubenswrapper[4865]: E1205 06:04:30.690264 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661\": container with ID starting with 6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661 not found: ID does not exist" containerID="6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.690305 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661"} err="failed to get container status \"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661\": rpc error: code = NotFound desc = could not find container \"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661\": container with ID starting with 6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.690330 4865 scope.go:117] "RemoveContainer" containerID="df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f" Dec 05 06:04:30 crc kubenswrapper[4865]: E1205 06:04:30.690913 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f\": container with ID starting with df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f not found: ID does not exist" containerID="df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.690940 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f"} err="failed to get container status \"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f\": rpc error: code = NotFound desc = could not find container \"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f\": container with ID starting with df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.690960 4865 scope.go:117] "RemoveContainer" containerID="6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85" Dec 05 06:04:30 crc kubenswrapper[4865]: E1205 06:04:30.691241 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85\": container with ID starting with 6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85 not found: ID does not exist" containerID="6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.691267 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85"} err="failed to get container status \"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85\": rpc error: code = NotFound desc = could not find container \"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85\": container with ID starting with 6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.691284 4865 scope.go:117] "RemoveContainer" containerID="b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90" Dec 05 06:04:30 crc kubenswrapper[4865]: E1205 06:04:30.691555 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90\": container with ID starting with b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90 not found: ID does not exist" containerID="b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.691586 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90"} err="failed to get container status \"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90\": rpc error: code = NotFound desc = could not find container \"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90\": container with ID starting with b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.691604 4865 scope.go:117] "RemoveContainer" containerID="985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7" Dec 05 06:04:30 crc kubenswrapper[4865]: E1205 06:04:30.692016 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7\": container with ID starting with 985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7 not found: ID does not exist" containerID="985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.692041 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7"} err="failed to get container status \"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7\": rpc error: code = NotFound desc = could not find container \"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7\": container with ID starting with 985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.692057 4865 scope.go:117] "RemoveContainer" containerID="67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270" Dec 05 06:04:30 crc 
kubenswrapper[4865]: E1205 06:04:30.692460 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270\": container with ID starting with 67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270 not found: ID does not exist" containerID="67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.692487 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270"} err="failed to get container status \"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270\": rpc error: code = NotFound desc = could not find container \"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270\": container with ID starting with 67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.692504 4865 scope.go:117] "RemoveContainer" containerID="2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee" Dec 05 06:04:30 crc kubenswrapper[4865]: E1205 06:04:30.692835 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee\": container with ID starting with 2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee not found: ID does not exist" containerID="2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.692865 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee"} err="failed to get container status \"2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee\": rpc error: code = NotFound desc = could not find container \"2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee\": container with ID starting with 2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.692884 4865 scope.go:117] "RemoveContainer" containerID="6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75" Dec 05 06:04:30 crc kubenswrapper[4865]: E1205 06:04:30.693115 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75\": container with ID starting with 6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75 not found: ID does not exist" containerID="6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.693140 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75"} err="failed to get container status \"6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75\": rpc error: code = NotFound desc = could not find container \"6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75\": container with ID starting with 6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: 
I1205 06:04:30.693159 4865 scope.go:117] "RemoveContainer" containerID="26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66" Dec 05 06:04:30 crc kubenswrapper[4865]: E1205 06:04:30.693622 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66\": container with ID starting with 26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66 not found: ID does not exist" containerID="26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.693650 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66"} err="failed to get container status \"26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66\": rpc error: code = NotFound desc = could not find container \"26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66\": container with ID starting with 26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.693672 4865 scope.go:117] "RemoveContainer" containerID="6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.693937 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661"} err="failed to get container status \"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661\": rpc error: code = NotFound desc = could not find container \"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661\": container with ID starting with 6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.693957 4865 scope.go:117] "RemoveContainer" containerID="df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.694301 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f"} err="failed to get container status \"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f\": rpc error: code = NotFound desc = could not find container \"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f\": container with ID starting with df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.694322 4865 scope.go:117] "RemoveContainer" containerID="6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.694573 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85"} err="failed to get container status \"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85\": rpc error: code = NotFound desc = could not find container \"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85\": container with ID starting with 6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: 
I1205 06:04:30.694593 4865 scope.go:117] "RemoveContainer" containerID="b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.694843 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90"} err="failed to get container status \"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90\": rpc error: code = NotFound desc = could not find container \"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90\": container with ID starting with b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.694869 4865 scope.go:117] "RemoveContainer" containerID="985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.695127 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7"} err="failed to get container status \"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7\": rpc error: code = NotFound desc = could not find container \"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7\": container with ID starting with 985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.695149 4865 scope.go:117] "RemoveContainer" containerID="67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.695349 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270"} err="failed to get container status \"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270\": rpc error: code = NotFound desc = could not find container \"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270\": container with ID starting with 67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.695370 4865 scope.go:117] "RemoveContainer" containerID="2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.695614 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee"} err="failed to get container status \"2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee\": rpc error: code = NotFound desc = could not find container \"2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee\": container with ID starting with 2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.695634 4865 scope.go:117] "RemoveContainer" containerID="6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.695854 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75"} err="failed to get container status 
\"6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75\": rpc error: code = NotFound desc = could not find container \"6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75\": container with ID starting with 6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.695877 4865 scope.go:117] "RemoveContainer" containerID="26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.696048 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66"} err="failed to get container status \"26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66\": rpc error: code = NotFound desc = could not find container \"26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66\": container with ID starting with 26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.696082 4865 scope.go:117] "RemoveContainer" containerID="6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.696323 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661"} err="failed to get container status \"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661\": rpc error: code = NotFound desc = could not find container \"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661\": container with ID starting with 6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.696345 4865 scope.go:117] "RemoveContainer" containerID="df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.696580 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f"} err="failed to get container status \"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f\": rpc error: code = NotFound desc = could not find container \"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f\": container with ID starting with df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.696602 4865 scope.go:117] "RemoveContainer" containerID="6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.697067 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85"} err="failed to get container status \"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85\": rpc error: code = NotFound desc = could not find container \"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85\": container with ID starting with 6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.697089 4865 scope.go:117] "RemoveContainer" 
containerID="b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.697374 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90"} err="failed to get container status \"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90\": rpc error: code = NotFound desc = could not find container \"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90\": container with ID starting with b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.697395 4865 scope.go:117] "RemoveContainer" containerID="985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.697814 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7"} err="failed to get container status \"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7\": rpc error: code = NotFound desc = could not find container \"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7\": container with ID starting with 985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.697858 4865 scope.go:117] "RemoveContainer" containerID="67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.698187 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270"} err="failed to get container status \"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270\": rpc error: code = NotFound desc = could not find container \"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270\": container with ID starting with 67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.698212 4865 scope.go:117] "RemoveContainer" containerID="2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.698446 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee"} err="failed to get container status \"2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee\": rpc error: code = NotFound desc = could not find container \"2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee\": container with ID starting with 2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.698471 4865 scope.go:117] "RemoveContainer" containerID="6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.705163 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75"} err="failed to get container status \"6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75\": rpc error: code = NotFound desc = could not find 
container \"6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75\": container with ID starting with 6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.705244 4865 scope.go:117] "RemoveContainer" containerID="26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.712106 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66"} err="failed to get container status \"26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66\": rpc error: code = NotFound desc = could not find container \"26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66\": container with ID starting with 26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.712174 4865 scope.go:117] "RemoveContainer" containerID="6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.713529 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661"} err="failed to get container status \"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661\": rpc error: code = NotFound desc = could not find container \"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661\": container with ID starting with 6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.713572 4865 scope.go:117] "RemoveContainer" containerID="df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.714070 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f"} err="failed to get container status \"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f\": rpc error: code = NotFound desc = could not find container \"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f\": container with ID starting with df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.714186 4865 scope.go:117] "RemoveContainer" containerID="6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.714730 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85"} err="failed to get container status \"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85\": rpc error: code = NotFound desc = could not find container \"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85\": container with ID starting with 6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.714791 4865 scope.go:117] "RemoveContainer" containerID="b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.715277 4865 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90"} err="failed to get container status \"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90\": rpc error: code = NotFound desc = could not find container \"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90\": container with ID starting with b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.715320 4865 scope.go:117] "RemoveContainer" containerID="985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.715641 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7"} err="failed to get container status \"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7\": rpc error: code = NotFound desc = could not find container \"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7\": container with ID starting with 985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.715673 4865 scope.go:117] "RemoveContainer" containerID="67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.716052 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270"} err="failed to get container status \"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270\": rpc error: code = NotFound desc = could not find container \"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270\": container with ID starting with 67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.716079 4865 scope.go:117] "RemoveContainer" containerID="2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.716361 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee"} err="failed to get container status \"2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee\": rpc error: code = NotFound desc = could not find container \"2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee\": container with ID starting with 2cb4f62496949d99f81d8c1c8e2d27dc441ab5395f92c1a1148940c03fd233ee not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.716401 4865 scope.go:117] "RemoveContainer" containerID="6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.716791 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75"} err="failed to get container status \"6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75\": rpc error: code = NotFound desc = could not find container \"6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75\": container with ID starting with 
6bffee05d9b6947c45071c1b710a7e289140d6542ddcf23774b3445a37528e75 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.716852 4865 scope.go:117] "RemoveContainer" containerID="26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.717122 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66"} err="failed to get container status \"26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66\": rpc error: code = NotFound desc = could not find container \"26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66\": container with ID starting with 26ee9bdb7fbb3a0b362e0340f2e949d3b2ccdbc6d06bdd75f02316b72934cb66 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.717150 4865 scope.go:117] "RemoveContainer" containerID="6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.717452 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661"} err="failed to get container status \"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661\": rpc error: code = NotFound desc = could not find container \"6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661\": container with ID starting with 6e4e72dc5cb29225de9d5256288a356c05d9b3eebcc60697c052d60c642a7661 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.717478 4865 scope.go:117] "RemoveContainer" containerID="df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.717917 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f"} err="failed to get container status \"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f\": rpc error: code = NotFound desc = could not find container \"df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f\": container with ID starting with df96512294447c7519a67171d8617e6cb2eda2c9092c7674d864c1e18268907f not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.717942 4865 scope.go:117] "RemoveContainer" containerID="6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.718257 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85"} err="failed to get container status \"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85\": rpc error: code = NotFound desc = could not find container \"6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85\": container with ID starting with 6f0e3ed25def8ca0eb1d5937756af52b38718e2033ed13411ab3dfa825a4db85 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.718288 4865 scope.go:117] "RemoveContainer" containerID="b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.718590 4865 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90"} err="failed to get container status \"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90\": rpc error: code = NotFound desc = could not find container \"b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90\": container with ID starting with b32d88273d84688d337a794d7ad740a983287caa9acd72aa197df71efea1fe90 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.718623 4865 scope.go:117] "RemoveContainer" containerID="985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.719119 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7"} err="failed to get container status \"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7\": rpc error: code = NotFound desc = could not find container \"985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7\": container with ID starting with 985c0bb1fd904ba285368b57ea6a9346cafd5dd706fd9669ff695ef977c873e7 not found: ID does not exist" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.719142 4865 scope.go:117] "RemoveContainer" containerID="67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270" Dec 05 06:04:30 crc kubenswrapper[4865]: I1205 06:04:30.719525 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270"} err="failed to get container status \"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270\": rpc error: code = NotFound desc = could not find container \"67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270\": container with ID starting with 67dce5bcb401e5238a52ac0023901b47687fe6c421fbc6dc8ef36fb3308e3270 not found: ID does not exist" Dec 05 06:04:31 crc kubenswrapper[4865]: I1205 06:04:31.012757 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e740ad4f-4c03-467b-8f0f-4fec2493d426" path="/var/lib/kubelet/pods/e740ad4f-4c03-467b-8f0f-4fec2493d426/volumes" Dec 05 06:04:31 crc kubenswrapper[4865]: I1205 06:04:31.463064 4865 generic.go:334] "Generic (PLEG): container finished" podID="bb88223f-cdf6-4222-af60-fd62638c4efa" containerID="32ac64da3e8ac154dfe388a62aafb409fa37c8973b964f5d5356dcad35a15924" exitCode=0 Dec 05 06:04:31 crc kubenswrapper[4865]: I1205 06:04:31.463485 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" event={"ID":"bb88223f-cdf6-4222-af60-fd62638c4efa","Type":"ContainerDied","Data":"32ac64da3e8ac154dfe388a62aafb409fa37c8973b964f5d5356dcad35a15924"} Dec 05 06:04:31 crc kubenswrapper[4865]: I1205 06:04:31.463530 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" event={"ID":"bb88223f-cdf6-4222-af60-fd62638c4efa","Type":"ContainerStarted","Data":"b370fb3e45c7f645e4441a310d6f6a68fbe5c22b2495fd966c6f016f077a073b"} Dec 05 06:04:31 crc kubenswrapper[4865]: I1205 06:04:31.468167 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bpkm9_2d1a82bf-1dc7-48e4-b2e2-32514537aae7/kube-multus/0.log" Dec 05 06:04:31 crc kubenswrapper[4865]: I1205 06:04:31.468441 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bpkm9" 
event={"ID":"2d1a82bf-1dc7-48e4-b2e2-32514537aae7","Type":"ContainerStarted","Data":"c041e8f7cb3a2d9243f934a185a0cda14fc48313df27797cecf86ee90ac29b71"} Dec 05 06:04:32 crc kubenswrapper[4865]: I1205 06:04:32.480485 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" event={"ID":"bb88223f-cdf6-4222-af60-fd62638c4efa","Type":"ContainerStarted","Data":"42fe9f21509037a83abcbd9932bd8c7bac7065413a4a9483d3d3788b4084ffb1"} Dec 05 06:04:32 crc kubenswrapper[4865]: I1205 06:04:32.481074 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" event={"ID":"bb88223f-cdf6-4222-af60-fd62638c4efa","Type":"ContainerStarted","Data":"3da37d53e2198322d710ad28b9951dc52a30f26fac56e481ce4fbab6d94409f9"} Dec 05 06:04:32 crc kubenswrapper[4865]: I1205 06:04:32.481113 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" event={"ID":"bb88223f-cdf6-4222-af60-fd62638c4efa","Type":"ContainerStarted","Data":"cd5665f1355e504f0be347069f2c8f11251f432f91c5f68cdec328b4350fbda7"} Dec 05 06:04:32 crc kubenswrapper[4865]: I1205 06:04:32.481143 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" event={"ID":"bb88223f-cdf6-4222-af60-fd62638c4efa","Type":"ContainerStarted","Data":"ea74aaeffb0f1461d573746af65d38cc1ba463c928dc2ba2bf39d01df1c51867"} Dec 05 06:04:32 crc kubenswrapper[4865]: I1205 06:04:32.481166 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" event={"ID":"bb88223f-cdf6-4222-af60-fd62638c4efa","Type":"ContainerStarted","Data":"554f5c51aff482f6965d2085cd148db880cca47d14d436e50d10cc72e64c1bf0"} Dec 05 06:04:32 crc kubenswrapper[4865]: I1205 06:04:32.481188 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" event={"ID":"bb88223f-cdf6-4222-af60-fd62638c4efa","Type":"ContainerStarted","Data":"74202c26d63eb125bee5352e41d2be307f96249627c2ba360808cf1e699c8036"} Dec 05 06:04:34 crc kubenswrapper[4865]: I1205 06:04:34.502696 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" event={"ID":"bb88223f-cdf6-4222-af60-fd62638c4efa","Type":"ContainerStarted","Data":"cc09a6731cc7fd641c1e8e8dc505fb0a72eb17f5c28dd0f70ec44ebe0afae984"} Dec 05 06:04:37 crc kubenswrapper[4865]: I1205 06:04:37.524772 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" event={"ID":"bb88223f-cdf6-4222-af60-fd62638c4efa","Type":"ContainerStarted","Data":"517e1a3aaff041f9c29ea55b3b295232c5afc25d37d42a3a9254c9f452ae4f47"} Dec 05 06:04:37 crc kubenswrapper[4865]: I1205 06:04:37.525272 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:37 crc kubenswrapper[4865]: I1205 06:04:37.525288 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:37 crc kubenswrapper[4865]: I1205 06:04:37.576224 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" podStartSLOduration=7.576202857 podStartE2EDuration="7.576202857s" podCreationTimestamp="2025-12-05 06:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:04:37.575080998 +0000 UTC 
m=+696.855092230" watchObservedRunningTime="2025-12-05 06:04:37.576202857 +0000 UTC m=+696.856214079" Dec 05 06:04:37 crc kubenswrapper[4865]: I1205 06:04:37.647013 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:38 crc kubenswrapper[4865]: I1205 06:04:38.531250 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:38 crc kubenswrapper[4865]: I1205 06:04:38.566859 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:04:41 crc kubenswrapper[4865]: I1205 06:04:41.048772 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:04:41 crc kubenswrapper[4865]: I1205 06:04:41.049701 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:04:50 crc kubenswrapper[4865]: I1205 06:04:50.687373 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t"] Dec 05 06:04:50 crc kubenswrapper[4865]: I1205 06:04:50.689898 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t" Dec 05 06:04:50 crc kubenswrapper[4865]: I1205 06:04:50.691754 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 06:04:50 crc kubenswrapper[4865]: I1205 06:04:50.703464 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t"] Dec 05 06:04:50 crc kubenswrapper[4865]: I1205 06:04:50.770660 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9466efcc-eb07-4316-a188-5b18e8108180-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t\" (UID: \"9466efcc-eb07-4316-a188-5b18e8108180\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t" Dec 05 06:04:50 crc kubenswrapper[4865]: I1205 06:04:50.770746 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9466efcc-eb07-4316-a188-5b18e8108180-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t\" (UID: \"9466efcc-eb07-4316-a188-5b18e8108180\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t" Dec 05 06:04:50 crc kubenswrapper[4865]: I1205 06:04:50.770932 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx4sm\" (UniqueName: \"kubernetes.io/projected/9466efcc-eb07-4316-a188-5b18e8108180-kube-api-access-fx4sm\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t\" (UID: 
\"9466efcc-eb07-4316-a188-5b18e8108180\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t" Dec 05 06:04:50 crc kubenswrapper[4865]: I1205 06:04:50.872480 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx4sm\" (UniqueName: \"kubernetes.io/projected/9466efcc-eb07-4316-a188-5b18e8108180-kube-api-access-fx4sm\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t\" (UID: \"9466efcc-eb07-4316-a188-5b18e8108180\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t" Dec 05 06:04:50 crc kubenswrapper[4865]: I1205 06:04:50.872677 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9466efcc-eb07-4316-a188-5b18e8108180-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t\" (UID: \"9466efcc-eb07-4316-a188-5b18e8108180\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t" Dec 05 06:04:50 crc kubenswrapper[4865]: I1205 06:04:50.872749 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9466efcc-eb07-4316-a188-5b18e8108180-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t\" (UID: \"9466efcc-eb07-4316-a188-5b18e8108180\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t" Dec 05 06:04:50 crc kubenswrapper[4865]: I1205 06:04:50.873704 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9466efcc-eb07-4316-a188-5b18e8108180-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t\" (UID: \"9466efcc-eb07-4316-a188-5b18e8108180\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t" Dec 05 06:04:50 crc kubenswrapper[4865]: I1205 06:04:50.873737 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9466efcc-eb07-4316-a188-5b18e8108180-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t\" (UID: \"9466efcc-eb07-4316-a188-5b18e8108180\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t" Dec 05 06:04:50 crc kubenswrapper[4865]: I1205 06:04:50.905760 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx4sm\" (UniqueName: \"kubernetes.io/projected/9466efcc-eb07-4316-a188-5b18e8108180-kube-api-access-fx4sm\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t\" (UID: \"9466efcc-eb07-4316-a188-5b18e8108180\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t" Dec 05 06:04:51 crc kubenswrapper[4865]: I1205 06:04:51.007676 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t" Dec 05 06:04:51 crc kubenswrapper[4865]: I1205 06:04:51.291467 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t"] Dec 05 06:04:51 crc kubenswrapper[4865]: W1205 06:04:51.302054 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9466efcc_eb07_4316_a188_5b18e8108180.slice/crio-ff27e216d4eb063f763160cac9b759a503a5d4a4b473ca24bddfd4f37eda2c49 WatchSource:0}: Error finding container ff27e216d4eb063f763160cac9b759a503a5d4a4b473ca24bddfd4f37eda2c49: Status 404 returned error can't find the container with id ff27e216d4eb063f763160cac9b759a503a5d4a4b473ca24bddfd4f37eda2c49 Dec 05 06:04:51 crc kubenswrapper[4865]: I1205 06:04:51.640075 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t" event={"ID":"9466efcc-eb07-4316-a188-5b18e8108180","Type":"ContainerStarted","Data":"1d636afbb6052a5d3225d40a110d0309cdf44cfce44e04d046527b59e11b6641"} Dec 05 06:04:51 crc kubenswrapper[4865]: I1205 06:04:51.640139 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t" event={"ID":"9466efcc-eb07-4316-a188-5b18e8108180","Type":"ContainerStarted","Data":"ff27e216d4eb063f763160cac9b759a503a5d4a4b473ca24bddfd4f37eda2c49"} Dec 05 06:04:52 crc kubenswrapper[4865]: I1205 06:04:52.549384 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4p25c"] Dec 05 06:04:52 crc kubenswrapper[4865]: I1205 06:04:52.551251 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4p25c" Dec 05 06:04:52 crc kubenswrapper[4865]: I1205 06:04:52.562039 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4p25c"] Dec 05 06:04:52 crc kubenswrapper[4865]: I1205 06:04:52.648736 4865 generic.go:334] "Generic (PLEG): container finished" podID="9466efcc-eb07-4316-a188-5b18e8108180" containerID="1d636afbb6052a5d3225d40a110d0309cdf44cfce44e04d046527b59e11b6641" exitCode=0 Dec 05 06:04:52 crc kubenswrapper[4865]: I1205 06:04:52.648786 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t" event={"ID":"9466efcc-eb07-4316-a188-5b18e8108180","Type":"ContainerDied","Data":"1d636afbb6052a5d3225d40a110d0309cdf44cfce44e04d046527b59e11b6641"} Dec 05 06:04:52 crc kubenswrapper[4865]: I1205 06:04:52.703615 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfg2f\" (UniqueName: \"kubernetes.io/projected/3c6f6447-4a73-4a61-b844-8b9244b22637-kube-api-access-dfg2f\") pod \"redhat-operators-4p25c\" (UID: \"3c6f6447-4a73-4a61-b844-8b9244b22637\") " pod="openshift-marketplace/redhat-operators-4p25c" Dec 05 06:04:52 crc kubenswrapper[4865]: I1205 06:04:52.703698 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c6f6447-4a73-4a61-b844-8b9244b22637-catalog-content\") pod \"redhat-operators-4p25c\" (UID: \"3c6f6447-4a73-4a61-b844-8b9244b22637\") " pod="openshift-marketplace/redhat-operators-4p25c" Dec 05 06:04:52 crc kubenswrapper[4865]: I1205 06:04:52.704011 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c6f6447-4a73-4a61-b844-8b9244b22637-utilities\") pod \"redhat-operators-4p25c\" (UID: \"3c6f6447-4a73-4a61-b844-8b9244b22637\") " pod="openshift-marketplace/redhat-operators-4p25c" Dec 05 06:04:52 crc kubenswrapper[4865]: I1205 06:04:52.804461 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c6f6447-4a73-4a61-b844-8b9244b22637-utilities\") pod \"redhat-operators-4p25c\" (UID: \"3c6f6447-4a73-4a61-b844-8b9244b22637\") " pod="openshift-marketplace/redhat-operators-4p25c" Dec 05 06:04:52 crc kubenswrapper[4865]: I1205 06:04:52.804558 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfg2f\" (UniqueName: \"kubernetes.io/projected/3c6f6447-4a73-4a61-b844-8b9244b22637-kube-api-access-dfg2f\") pod \"redhat-operators-4p25c\" (UID: \"3c6f6447-4a73-4a61-b844-8b9244b22637\") " pod="openshift-marketplace/redhat-operators-4p25c" Dec 05 06:04:52 crc kubenswrapper[4865]: I1205 06:04:52.804582 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c6f6447-4a73-4a61-b844-8b9244b22637-catalog-content\") pod \"redhat-operators-4p25c\" (UID: \"3c6f6447-4a73-4a61-b844-8b9244b22637\") " pod="openshift-marketplace/redhat-operators-4p25c" Dec 05 06:04:52 crc kubenswrapper[4865]: I1205 06:04:52.805085 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c6f6447-4a73-4a61-b844-8b9244b22637-utilities\") pod \"redhat-operators-4p25c\" (UID: 
\"3c6f6447-4a73-4a61-b844-8b9244b22637\") " pod="openshift-marketplace/redhat-operators-4p25c" Dec 05 06:04:52 crc kubenswrapper[4865]: I1205 06:04:52.805110 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c6f6447-4a73-4a61-b844-8b9244b22637-catalog-content\") pod \"redhat-operators-4p25c\" (UID: \"3c6f6447-4a73-4a61-b844-8b9244b22637\") " pod="openshift-marketplace/redhat-operators-4p25c" Dec 05 06:04:52 crc kubenswrapper[4865]: I1205 06:04:52.829302 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfg2f\" (UniqueName: \"kubernetes.io/projected/3c6f6447-4a73-4a61-b844-8b9244b22637-kube-api-access-dfg2f\") pod \"redhat-operators-4p25c\" (UID: \"3c6f6447-4a73-4a61-b844-8b9244b22637\") " pod="openshift-marketplace/redhat-operators-4p25c" Dec 05 06:04:52 crc kubenswrapper[4865]: I1205 06:04:52.909655 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4p25c" Dec 05 06:04:53 crc kubenswrapper[4865]: I1205 06:04:53.121546 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4p25c"] Dec 05 06:04:53 crc kubenswrapper[4865]: W1205 06:04:53.124658 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c6f6447_4a73_4a61_b844_8b9244b22637.slice/crio-0485677c84a097a6b43f730d23a77487a470e5097a4639df9928d00df2d5f134 WatchSource:0}: Error finding container 0485677c84a097a6b43f730d23a77487a470e5097a4639df9928d00df2d5f134: Status 404 returned error can't find the container with id 0485677c84a097a6b43f730d23a77487a470e5097a4639df9928d00df2d5f134 Dec 05 06:04:53 crc kubenswrapper[4865]: I1205 06:04:53.656787 4865 generic.go:334] "Generic (PLEG): container finished" podID="3c6f6447-4a73-4a61-b844-8b9244b22637" containerID="21afaed35bb1ca688aee904fc63eb79165aff9edba6a2383a5c6cad55ef4dbb2" exitCode=0 Dec 05 06:04:53 crc kubenswrapper[4865]: I1205 06:04:53.656866 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4p25c" event={"ID":"3c6f6447-4a73-4a61-b844-8b9244b22637","Type":"ContainerDied","Data":"21afaed35bb1ca688aee904fc63eb79165aff9edba6a2383a5c6cad55ef4dbb2"} Dec 05 06:04:53 crc kubenswrapper[4865]: I1205 06:04:53.656897 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4p25c" event={"ID":"3c6f6447-4a73-4a61-b844-8b9244b22637","Type":"ContainerStarted","Data":"0485677c84a097a6b43f730d23a77487a470e5097a4639df9928d00df2d5f134"} Dec 05 06:04:54 crc kubenswrapper[4865]: I1205 06:04:54.666911 4865 generic.go:334] "Generic (PLEG): container finished" podID="9466efcc-eb07-4316-a188-5b18e8108180" containerID="d0ac6f86bf174364cea07f7c99f75f6ee5551e758a4b76f229f0c0f0279279c6" exitCode=0 Dec 05 06:04:54 crc kubenswrapper[4865]: I1205 06:04:54.667008 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t" event={"ID":"9466efcc-eb07-4316-a188-5b18e8108180","Type":"ContainerDied","Data":"d0ac6f86bf174364cea07f7c99f75f6ee5551e758a4b76f229f0c0f0279279c6"} Dec 05 06:04:54 crc kubenswrapper[4865]: I1205 06:04:54.671428 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4p25c" 
event={"ID":"3c6f6447-4a73-4a61-b844-8b9244b22637","Type":"ContainerStarted","Data":"8db9159f6495c8c4778543a55247fd86dcf60f6436820539c2edbda25e4f9649"} Dec 05 06:04:55 crc kubenswrapper[4865]: I1205 06:04:55.689751 4865 generic.go:334] "Generic (PLEG): container finished" podID="3c6f6447-4a73-4a61-b844-8b9244b22637" containerID="8db9159f6495c8c4778543a55247fd86dcf60f6436820539c2edbda25e4f9649" exitCode=0 Dec 05 06:04:55 crc kubenswrapper[4865]: I1205 06:04:55.689871 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4p25c" event={"ID":"3c6f6447-4a73-4a61-b844-8b9244b22637","Type":"ContainerDied","Data":"8db9159f6495c8c4778543a55247fd86dcf60f6436820539c2edbda25e4f9649"} Dec 05 06:04:55 crc kubenswrapper[4865]: I1205 06:04:55.698381 4865 generic.go:334] "Generic (PLEG): container finished" podID="9466efcc-eb07-4316-a188-5b18e8108180" containerID="cad66c9375bff24b0d2e7c844692a1c95e3a753efc57963454469963a2adb396" exitCode=0 Dec 05 06:04:55 crc kubenswrapper[4865]: I1205 06:04:55.698449 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t" event={"ID":"9466efcc-eb07-4316-a188-5b18e8108180","Type":"ContainerDied","Data":"cad66c9375bff24b0d2e7c844692a1c95e3a753efc57963454469963a2adb396"} Dec 05 06:04:56 crc kubenswrapper[4865]: I1205 06:04:56.709965 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4p25c" event={"ID":"3c6f6447-4a73-4a61-b844-8b9244b22637","Type":"ContainerStarted","Data":"68b9a3d3a224e57798027484c2ac9b0e4e72015e088bffdd1c0d5cdd8e9942e2"} Dec 05 06:04:56 crc kubenswrapper[4865]: I1205 06:04:56.743882 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4p25c" podStartSLOduration=2.279453083 podStartE2EDuration="4.743861816s" podCreationTimestamp="2025-12-05 06:04:52 +0000 UTC" firstStartedPulling="2025-12-05 06:04:53.658602478 +0000 UTC m=+712.938613690" lastFinishedPulling="2025-12-05 06:04:56.123011161 +0000 UTC m=+715.403022423" observedRunningTime="2025-12-05 06:04:56.737422217 +0000 UTC m=+716.017433449" watchObservedRunningTime="2025-12-05 06:04:56.743861816 +0000 UTC m=+716.023873048" Dec 05 06:04:56 crc kubenswrapper[4865]: I1205 06:04:56.988097 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t" Dec 05 06:04:57 crc kubenswrapper[4865]: I1205 06:04:57.077812 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9466efcc-eb07-4316-a188-5b18e8108180-util\") pod \"9466efcc-eb07-4316-a188-5b18e8108180\" (UID: \"9466efcc-eb07-4316-a188-5b18e8108180\") " Dec 05 06:04:57 crc kubenswrapper[4865]: I1205 06:04:57.077965 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx4sm\" (UniqueName: \"kubernetes.io/projected/9466efcc-eb07-4316-a188-5b18e8108180-kube-api-access-fx4sm\") pod \"9466efcc-eb07-4316-a188-5b18e8108180\" (UID: \"9466efcc-eb07-4316-a188-5b18e8108180\") " Dec 05 06:04:57 crc kubenswrapper[4865]: I1205 06:04:57.078026 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9466efcc-eb07-4316-a188-5b18e8108180-bundle\") pod \"9466efcc-eb07-4316-a188-5b18e8108180\" (UID: \"9466efcc-eb07-4316-a188-5b18e8108180\") " Dec 05 06:04:57 crc kubenswrapper[4865]: I1205 06:04:57.078768 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9466efcc-eb07-4316-a188-5b18e8108180-bundle" (OuterVolumeSpecName: "bundle") pod "9466efcc-eb07-4316-a188-5b18e8108180" (UID: "9466efcc-eb07-4316-a188-5b18e8108180"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:04:57 crc kubenswrapper[4865]: I1205 06:04:57.085121 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9466efcc-eb07-4316-a188-5b18e8108180-kube-api-access-fx4sm" (OuterVolumeSpecName: "kube-api-access-fx4sm") pod "9466efcc-eb07-4316-a188-5b18e8108180" (UID: "9466efcc-eb07-4316-a188-5b18e8108180"). InnerVolumeSpecName "kube-api-access-fx4sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:04:57 crc kubenswrapper[4865]: I1205 06:04:57.088841 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9466efcc-eb07-4316-a188-5b18e8108180-util" (OuterVolumeSpecName: "util") pod "9466efcc-eb07-4316-a188-5b18e8108180" (UID: "9466efcc-eb07-4316-a188-5b18e8108180"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:04:57 crc kubenswrapper[4865]: I1205 06:04:57.179356 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx4sm\" (UniqueName: \"kubernetes.io/projected/9466efcc-eb07-4316-a188-5b18e8108180-kube-api-access-fx4sm\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:57 crc kubenswrapper[4865]: I1205 06:04:57.179404 4865 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9466efcc-eb07-4316-a188-5b18e8108180-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:57 crc kubenswrapper[4865]: I1205 06:04:57.179419 4865 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9466efcc-eb07-4316-a188-5b18e8108180-util\") on node \"crc\" DevicePath \"\"" Dec 05 06:04:57 crc kubenswrapper[4865]: I1205 06:04:57.722329 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t" Dec 05 06:04:57 crc kubenswrapper[4865]: I1205 06:04:57.724891 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t" event={"ID":"9466efcc-eb07-4316-a188-5b18e8108180","Type":"ContainerDied","Data":"ff27e216d4eb063f763160cac9b759a503a5d4a4b473ca24bddfd4f37eda2c49"} Dec 05 06:04:57 crc kubenswrapper[4865]: I1205 06:04:57.724918 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff27e216d4eb063f763160cac9b759a503a5d4a4b473ca24bddfd4f37eda2c49" Dec 05 06:05:00 crc kubenswrapper[4865]: I1205 06:05:00.659239 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mzl8h" Dec 05 06:05:02 crc kubenswrapper[4865]: I1205 06:05:02.134130 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-fdgdj"] Dec 05 06:05:02 crc kubenswrapper[4865]: E1205 06:05:02.134356 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9466efcc-eb07-4316-a188-5b18e8108180" containerName="pull" Dec 05 06:05:02 crc kubenswrapper[4865]: I1205 06:05:02.134368 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="9466efcc-eb07-4316-a188-5b18e8108180" containerName="pull" Dec 05 06:05:02 crc kubenswrapper[4865]: E1205 06:05:02.134385 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9466efcc-eb07-4316-a188-5b18e8108180" containerName="util" Dec 05 06:05:02 crc kubenswrapper[4865]: I1205 06:05:02.134391 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="9466efcc-eb07-4316-a188-5b18e8108180" containerName="util" Dec 05 06:05:02 crc kubenswrapper[4865]: E1205 06:05:02.134398 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9466efcc-eb07-4316-a188-5b18e8108180" containerName="extract" Dec 05 06:05:02 crc kubenswrapper[4865]: I1205 06:05:02.134406 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="9466efcc-eb07-4316-a188-5b18e8108180" containerName="extract" Dec 05 06:05:02 crc kubenswrapper[4865]: I1205 06:05:02.134511 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="9466efcc-eb07-4316-a188-5b18e8108180" containerName="extract" Dec 05 06:05:02 crc kubenswrapper[4865]: I1205 06:05:02.134930 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-fdgdj" Dec 05 06:05:02 crc kubenswrapper[4865]: I1205 06:05:02.138886 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 05 06:05:02 crc kubenswrapper[4865]: I1205 06:05:02.139034 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 05 06:05:02 crc kubenswrapper[4865]: I1205 06:05:02.141286 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-mzl9f" Dec 05 06:05:02 crc kubenswrapper[4865]: I1205 06:05:02.159429 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-fdgdj"] Dec 05 06:05:02 crc kubenswrapper[4865]: I1205 06:05:02.247569 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff4w5\" (UniqueName: \"kubernetes.io/projected/806823de-8d1e-48f9-964f-86cd689434c7-kube-api-access-ff4w5\") pod \"nmstate-operator-5b5b58f5c8-fdgdj\" (UID: \"806823de-8d1e-48f9-964f-86cd689434c7\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-fdgdj" Dec 05 06:05:02 crc kubenswrapper[4865]: I1205 06:05:02.349239 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff4w5\" (UniqueName: \"kubernetes.io/projected/806823de-8d1e-48f9-964f-86cd689434c7-kube-api-access-ff4w5\") pod \"nmstate-operator-5b5b58f5c8-fdgdj\" (UID: \"806823de-8d1e-48f9-964f-86cd689434c7\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-fdgdj" Dec 05 06:05:02 crc kubenswrapper[4865]: I1205 06:05:02.376951 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff4w5\" (UniqueName: \"kubernetes.io/projected/806823de-8d1e-48f9-964f-86cd689434c7-kube-api-access-ff4w5\") pod \"nmstate-operator-5b5b58f5c8-fdgdj\" (UID: \"806823de-8d1e-48f9-964f-86cd689434c7\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-fdgdj" Dec 05 06:05:02 crc kubenswrapper[4865]: I1205 06:05:02.453995 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-fdgdj" Dec 05 06:05:02 crc kubenswrapper[4865]: I1205 06:05:02.664095 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-fdgdj"] Dec 05 06:05:02 crc kubenswrapper[4865]: I1205 06:05:02.758040 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-fdgdj" event={"ID":"806823de-8d1e-48f9-964f-86cd689434c7","Type":"ContainerStarted","Data":"b8fe6d2cdd0aeef49020a4c1dda00638193c5e743813b377ca01e5801d64174b"} Dec 05 06:05:02 crc kubenswrapper[4865]: I1205 06:05:02.911103 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4p25c" Dec 05 06:05:02 crc kubenswrapper[4865]: I1205 06:05:02.911352 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4p25c" Dec 05 06:05:02 crc kubenswrapper[4865]: I1205 06:05:02.971278 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4p25c" Dec 05 06:05:03 crc kubenswrapper[4865]: I1205 06:05:03.836040 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4p25c" Dec 05 06:05:05 crc kubenswrapper[4865]: I1205 06:05:05.527952 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4p25c"] Dec 05 06:05:06 crc kubenswrapper[4865]: I1205 06:05:06.789231 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4p25c" podUID="3c6f6447-4a73-4a61-b844-8b9244b22637" containerName="registry-server" containerID="cri-o://68b9a3d3a224e57798027484c2ac9b0e4e72015e088bffdd1c0d5cdd8e9942e2" gracePeriod=2 Dec 05 06:05:08 crc kubenswrapper[4865]: I1205 06:05:08.803591 4865 generic.go:334] "Generic (PLEG): container finished" podID="3c6f6447-4a73-4a61-b844-8b9244b22637" containerID="68b9a3d3a224e57798027484c2ac9b0e4e72015e088bffdd1c0d5cdd8e9942e2" exitCode=0 Dec 05 06:05:08 crc kubenswrapper[4865]: I1205 06:05:08.803677 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4p25c" event={"ID":"3c6f6447-4a73-4a61-b844-8b9244b22637","Type":"ContainerDied","Data":"68b9a3d3a224e57798027484c2ac9b0e4e72015e088bffdd1c0d5cdd8e9942e2"} Dec 05 06:05:09 crc kubenswrapper[4865]: I1205 06:05:09.177363 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4p25c" Dec 05 06:05:09 crc kubenswrapper[4865]: I1205 06:05:09.352010 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c6f6447-4a73-4a61-b844-8b9244b22637-utilities\") pod \"3c6f6447-4a73-4a61-b844-8b9244b22637\" (UID: \"3c6f6447-4a73-4a61-b844-8b9244b22637\") " Dec 05 06:05:09 crc kubenswrapper[4865]: I1205 06:05:09.352182 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c6f6447-4a73-4a61-b844-8b9244b22637-catalog-content\") pod \"3c6f6447-4a73-4a61-b844-8b9244b22637\" (UID: \"3c6f6447-4a73-4a61-b844-8b9244b22637\") " Dec 05 06:05:09 crc kubenswrapper[4865]: I1205 06:05:09.352234 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfg2f\" (UniqueName: \"kubernetes.io/projected/3c6f6447-4a73-4a61-b844-8b9244b22637-kube-api-access-dfg2f\") pod \"3c6f6447-4a73-4a61-b844-8b9244b22637\" (UID: \"3c6f6447-4a73-4a61-b844-8b9244b22637\") " Dec 05 06:05:09 crc kubenswrapper[4865]: I1205 06:05:09.353395 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c6f6447-4a73-4a61-b844-8b9244b22637-utilities" (OuterVolumeSpecName: "utilities") pod "3c6f6447-4a73-4a61-b844-8b9244b22637" (UID: "3c6f6447-4a73-4a61-b844-8b9244b22637"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:05:09 crc kubenswrapper[4865]: I1205 06:05:09.376124 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c6f6447-4a73-4a61-b844-8b9244b22637-kube-api-access-dfg2f" (OuterVolumeSpecName: "kube-api-access-dfg2f") pod "3c6f6447-4a73-4a61-b844-8b9244b22637" (UID: "3c6f6447-4a73-4a61-b844-8b9244b22637"). InnerVolumeSpecName "kube-api-access-dfg2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:05:09 crc kubenswrapper[4865]: I1205 06:05:09.453425 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfg2f\" (UniqueName: \"kubernetes.io/projected/3c6f6447-4a73-4a61-b844-8b9244b22637-kube-api-access-dfg2f\") on node \"crc\" DevicePath \"\"" Dec 05 06:05:09 crc kubenswrapper[4865]: I1205 06:05:09.453465 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c6f6447-4a73-4a61-b844-8b9244b22637-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:05:09 crc kubenswrapper[4865]: I1205 06:05:09.504674 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c6f6447-4a73-4a61-b844-8b9244b22637-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c6f6447-4a73-4a61-b844-8b9244b22637" (UID: "3c6f6447-4a73-4a61-b844-8b9244b22637"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:05:09 crc kubenswrapper[4865]: I1205 06:05:09.554964 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c6f6447-4a73-4a61-b844-8b9244b22637-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:05:09 crc kubenswrapper[4865]: I1205 06:05:09.817507 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4p25c" event={"ID":"3c6f6447-4a73-4a61-b844-8b9244b22637","Type":"ContainerDied","Data":"0485677c84a097a6b43f730d23a77487a470e5097a4639df9928d00df2d5f134"} Dec 05 06:05:09 crc kubenswrapper[4865]: I1205 06:05:09.817616 4865 scope.go:117] "RemoveContainer" containerID="68b9a3d3a224e57798027484c2ac9b0e4e72015e088bffdd1c0d5cdd8e9942e2" Dec 05 06:05:09 crc kubenswrapper[4865]: I1205 06:05:09.817974 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4p25c" Dec 05 06:05:09 crc kubenswrapper[4865]: I1205 06:05:09.850209 4865 scope.go:117] "RemoveContainer" containerID="8db9159f6495c8c4778543a55247fd86dcf60f6436820539c2edbda25e4f9649" Dec 05 06:05:09 crc kubenswrapper[4865]: I1205 06:05:09.881870 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4p25c"] Dec 05 06:05:09 crc kubenswrapper[4865]: I1205 06:05:09.887524 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4p25c"] Dec 05 06:05:09 crc kubenswrapper[4865]: I1205 06:05:09.918451 4865 scope.go:117] "RemoveContainer" containerID="21afaed35bb1ca688aee904fc63eb79165aff9edba6a2383a5c6cad55ef4dbb2" Dec 05 06:05:10 crc kubenswrapper[4865]: I1205 06:05:10.827314 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-fdgdj" event={"ID":"806823de-8d1e-48f9-964f-86cd689434c7","Type":"ContainerStarted","Data":"f1d6fe8d27a207cf040a6935dcd1c61fdc432f619425350f6cee19e6a0fd1bb9"} Dec 05 06:05:10 crc kubenswrapper[4865]: I1205 06:05:10.852151 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-fdgdj" podStartSLOduration=1.742806138 podStartE2EDuration="8.852132598s" podCreationTimestamp="2025-12-05 06:05:02 +0000 UTC" firstStartedPulling="2025-12-05 06:05:02.678469611 +0000 UTC m=+721.958480833" lastFinishedPulling="2025-12-05 06:05:09.787796061 +0000 UTC m=+729.067807293" observedRunningTime="2025-12-05 06:05:10.851213382 +0000 UTC m=+730.131224604" watchObservedRunningTime="2025-12-05 06:05:10.852132598 +0000 UTC m=+730.132143810" Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.013643 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c6f6447-4a73-4a61-b844-8b9244b22637" path="/var/lib/kubelet/pods/3c6f6447-4a73-4a61-b844-8b9244b22637/volumes" Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.048844 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.048912 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.884210 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-wm4nq"] Dec 05 06:05:11 crc kubenswrapper[4865]: E1205 06:05:11.884455 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c6f6447-4a73-4a61-b844-8b9244b22637" containerName="registry-server" Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.884470 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c6f6447-4a73-4a61-b844-8b9244b22637" containerName="registry-server" Dec 05 06:05:11 crc kubenswrapper[4865]: E1205 06:05:11.884480 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c6f6447-4a73-4a61-b844-8b9244b22637" containerName="extract-content" Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.884486 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c6f6447-4a73-4a61-b844-8b9244b22637" containerName="extract-content" Dec 05 06:05:11 crc kubenswrapper[4865]: E1205 06:05:11.884802 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c6f6447-4a73-4a61-b844-8b9244b22637" containerName="extract-utilities" Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.884809 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c6f6447-4a73-4a61-b844-8b9244b22637" containerName="extract-utilities" Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.884922 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c6f6447-4a73-4a61-b844-8b9244b22637" containerName="registry-server" Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.885491 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-wm4nq" Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.888711 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-2w6pr" Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.904640 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-cpqvb"] Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.905593 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-cpqvb" Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.907729 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-wm4nq"] Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.912878 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.926420 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-6nk5l"] Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.927722 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-6nk5l" Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.951764 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-cpqvb"] Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.994123 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/88204ee2-fa2e-4780-97c0-4ca5aa8554fe-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-cpqvb\" (UID: \"88204ee2-fa2e-4780-97c0-4ca5aa8554fe\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-cpqvb" Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.994195 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qdn9\" (UniqueName: \"kubernetes.io/projected/76931d96-861b-4372-9209-98f4b296df1c-kube-api-access-8qdn9\") pod \"nmstate-metrics-7f946cbc9-wm4nq\" (UID: \"76931d96-861b-4372-9209-98f4b296df1c\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-wm4nq" Dec 05 06:05:11 crc kubenswrapper[4865]: I1205 06:05:11.994355 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg7kt\" (UniqueName: \"kubernetes.io/projected/88204ee2-fa2e-4780-97c0-4ca5aa8554fe-kube-api-access-wg7kt\") pod \"nmstate-webhook-5f6d4c5ccb-cpqvb\" (UID: \"88204ee2-fa2e-4780-97c0-4ca5aa8554fe\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-cpqvb" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.039670 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-ctjwp"] Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.040333 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-ctjwp" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.044483 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.045687 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-j9dsp" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.047057 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.096195 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m758\" (UniqueName: \"kubernetes.io/projected/4b89f3ad-3a92-467c-a613-5c567bbe8e0e-kube-api-access-5m758\") pod \"nmstate-handler-6nk5l\" (UID: \"4b89f3ad-3a92-467c-a613-5c567bbe8e0e\") " pod="openshift-nmstate/nmstate-handler-6nk5l" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.096249 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/88204ee2-fa2e-4780-97c0-4ca5aa8554fe-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-cpqvb\" (UID: \"88204ee2-fa2e-4780-97c0-4ca5aa8554fe\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-cpqvb" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.096278 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4b89f3ad-3a92-467c-a613-5c567bbe8e0e-nmstate-lock\") pod \"nmstate-handler-6nk5l\" (UID: \"4b89f3ad-3a92-467c-a613-5c567bbe8e0e\") " pod="openshift-nmstate/nmstate-handler-6nk5l" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.096329 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4b89f3ad-3a92-467c-a613-5c567bbe8e0e-dbus-socket\") pod \"nmstate-handler-6nk5l\" (UID: \"4b89f3ad-3a92-467c-a613-5c567bbe8e0e\") " pod="openshift-nmstate/nmstate-handler-6nk5l" Dec 05 06:05:12 crc kubenswrapper[4865]: E1205 06:05:12.096424 4865 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.096497 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qdn9\" (UniqueName: \"kubernetes.io/projected/76931d96-861b-4372-9209-98f4b296df1c-kube-api-access-8qdn9\") pod \"nmstate-metrics-7f946cbc9-wm4nq\" (UID: \"76931d96-861b-4372-9209-98f4b296df1c\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-wm4nq" Dec 05 06:05:12 crc kubenswrapper[4865]: E1205 06:05:12.096544 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88204ee2-fa2e-4780-97c0-4ca5aa8554fe-tls-key-pair podName:88204ee2-fa2e-4780-97c0-4ca5aa8554fe nodeName:}" failed. No retries permitted until 2025-12-05 06:05:12.596511922 +0000 UTC m=+731.876523134 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/88204ee2-fa2e-4780-97c0-4ca5aa8554fe-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-cpqvb" (UID: "88204ee2-fa2e-4780-97c0-4ca5aa8554fe") : secret "openshift-nmstate-webhook" not found Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.096757 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg7kt\" (UniqueName: \"kubernetes.io/projected/88204ee2-fa2e-4780-97c0-4ca5aa8554fe-kube-api-access-wg7kt\") pod \"nmstate-webhook-5f6d4c5ccb-cpqvb\" (UID: \"88204ee2-fa2e-4780-97c0-4ca5aa8554fe\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-cpqvb" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.096881 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4b89f3ad-3a92-467c-a613-5c567bbe8e0e-ovs-socket\") pod \"nmstate-handler-6nk5l\" (UID: \"4b89f3ad-3a92-467c-a613-5c567bbe8e0e\") " pod="openshift-nmstate/nmstate-handler-6nk5l" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.106172 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-ctjwp"] Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.123899 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg7kt\" (UniqueName: \"kubernetes.io/projected/88204ee2-fa2e-4780-97c0-4ca5aa8554fe-kube-api-access-wg7kt\") pod \"nmstate-webhook-5f6d4c5ccb-cpqvb\" (UID: \"88204ee2-fa2e-4780-97c0-4ca5aa8554fe\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-cpqvb" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.124872 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qdn9\" (UniqueName: \"kubernetes.io/projected/76931d96-861b-4372-9209-98f4b296df1c-kube-api-access-8qdn9\") pod \"nmstate-metrics-7f946cbc9-wm4nq\" (UID: \"76931d96-861b-4372-9209-98f4b296df1c\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-wm4nq" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.197792 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htcmt\" (UniqueName: \"kubernetes.io/projected/47720eeb-4718-4077-8c64-8184aa08b670-kube-api-access-htcmt\") pod \"nmstate-console-plugin-7fbb5f6569-ctjwp\" (UID: \"47720eeb-4718-4077-8c64-8184aa08b670\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-ctjwp" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.197865 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m758\" (UniqueName: \"kubernetes.io/projected/4b89f3ad-3a92-467c-a613-5c567bbe8e0e-kube-api-access-5m758\") pod \"nmstate-handler-6nk5l\" (UID: \"4b89f3ad-3a92-467c-a613-5c567bbe8e0e\") " pod="openshift-nmstate/nmstate-handler-6nk5l" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.197903 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4b89f3ad-3a92-467c-a613-5c567bbe8e0e-nmstate-lock\") pod \"nmstate-handler-6nk5l\" (UID: \"4b89f3ad-3a92-467c-a613-5c567bbe8e0e\") " pod="openshift-nmstate/nmstate-handler-6nk5l" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.197934 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: 
\"kubernetes.io/host-path/4b89f3ad-3a92-467c-a613-5c567bbe8e0e-dbus-socket\") pod \"nmstate-handler-6nk5l\" (UID: \"4b89f3ad-3a92-467c-a613-5c567bbe8e0e\") " pod="openshift-nmstate/nmstate-handler-6nk5l" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.197963 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/47720eeb-4718-4077-8c64-8184aa08b670-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-ctjwp\" (UID: \"47720eeb-4718-4077-8c64-8184aa08b670\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-ctjwp" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.197989 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/47720eeb-4718-4077-8c64-8184aa08b670-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-ctjwp\" (UID: \"47720eeb-4718-4077-8c64-8184aa08b670\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-ctjwp" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.198007 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4b89f3ad-3a92-467c-a613-5c567bbe8e0e-ovs-socket\") pod \"nmstate-handler-6nk5l\" (UID: \"4b89f3ad-3a92-467c-a613-5c567bbe8e0e\") " pod="openshift-nmstate/nmstate-handler-6nk5l" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.198073 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4b89f3ad-3a92-467c-a613-5c567bbe8e0e-ovs-socket\") pod \"nmstate-handler-6nk5l\" (UID: \"4b89f3ad-3a92-467c-a613-5c567bbe8e0e\") " pod="openshift-nmstate/nmstate-handler-6nk5l" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.198279 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4b89f3ad-3a92-467c-a613-5c567bbe8e0e-dbus-socket\") pod \"nmstate-handler-6nk5l\" (UID: \"4b89f3ad-3a92-467c-a613-5c567bbe8e0e\") " pod="openshift-nmstate/nmstate-handler-6nk5l" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.198307 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4b89f3ad-3a92-467c-a613-5c567bbe8e0e-nmstate-lock\") pod \"nmstate-handler-6nk5l\" (UID: \"4b89f3ad-3a92-467c-a613-5c567bbe8e0e\") " pod="openshift-nmstate/nmstate-handler-6nk5l" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.202653 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-wm4nq" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.221861 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m758\" (UniqueName: \"kubernetes.io/projected/4b89f3ad-3a92-467c-a613-5c567bbe8e0e-kube-api-access-5m758\") pod \"nmstate-handler-6nk5l\" (UID: \"4b89f3ad-3a92-467c-a613-5c567bbe8e0e\") " pod="openshift-nmstate/nmstate-handler-6nk5l" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.247134 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7d87879d95-d6dp4"] Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.248238 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.248528 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-6nk5l" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.280473 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d87879d95-d6dp4"] Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.305969 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/47720eeb-4718-4077-8c64-8184aa08b670-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-ctjwp\" (UID: \"47720eeb-4718-4077-8c64-8184aa08b670\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-ctjwp" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.306042 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htcmt\" (UniqueName: \"kubernetes.io/projected/47720eeb-4718-4077-8c64-8184aa08b670-kube-api-access-htcmt\") pod \"nmstate-console-plugin-7fbb5f6569-ctjwp\" (UID: \"47720eeb-4718-4077-8c64-8184aa08b670\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-ctjwp" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.306109 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/47720eeb-4718-4077-8c64-8184aa08b670-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-ctjwp\" (UID: \"47720eeb-4718-4077-8c64-8184aa08b670\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-ctjwp" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.307200 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/47720eeb-4718-4077-8c64-8184aa08b670-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-ctjwp\" (UID: \"47720eeb-4718-4077-8c64-8184aa08b670\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-ctjwp" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.310510 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/47720eeb-4718-4077-8c64-8184aa08b670-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-ctjwp\" (UID: \"47720eeb-4718-4077-8c64-8184aa08b670\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-ctjwp" Dec 05 06:05:12 crc kubenswrapper[4865]: W1205 06:05:12.313547 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b89f3ad_3a92_467c_a613_5c567bbe8e0e.slice/crio-2b5532d9fd0f88445849a1eacca215dad48a3c1873b86a79d2d45ab3a5ab115a WatchSource:0}: Error finding container 2b5532d9fd0f88445849a1eacca215dad48a3c1873b86a79d2d45ab3a5ab115a: Status 404 returned error can't find the container with id 2b5532d9fd0f88445849a1eacca215dad48a3c1873b86a79d2d45ab3a5ab115a Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.326859 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htcmt\" (UniqueName: \"kubernetes.io/projected/47720eeb-4718-4077-8c64-8184aa08b670-kube-api-access-htcmt\") pod \"nmstate-console-plugin-7fbb5f6569-ctjwp\" (UID: \"47720eeb-4718-4077-8c64-8184aa08b670\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-ctjwp" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.356168 4865 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-ctjwp" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.407854 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-oauth-serving-cert\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.407908 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-console-serving-cert\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.407972 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-trusted-ca-bundle\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.408011 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-console-oauth-config\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.408047 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-console-config\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.408102 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-service-ca\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.408128 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shx8z\" (UniqueName: \"kubernetes.io/projected/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-kube-api-access-shx8z\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.508949 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-console-config\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.509046 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-service-ca\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.509074 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shx8z\" (UniqueName: \"kubernetes.io/projected/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-kube-api-access-shx8z\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.509111 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-oauth-serving-cert\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.509136 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-console-serving-cert\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.509197 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-trusted-ca-bundle\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.509240 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-console-oauth-config\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.510118 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-console-config\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.510300 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-oauth-serving-cert\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.511006 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-trusted-ca-bundle\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.512137 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-service-ca\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.514459 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-console-oauth-config\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.516992 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-console-serving-cert\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.525345 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shx8z\" (UniqueName: \"kubernetes.io/projected/d43a5f18-5aa4-4789-8b32-08f5b7f31eb9-kube-api-access-shx8z\") pod \"console-7d87879d95-d6dp4\" (UID: \"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9\") " pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.570807 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.610472 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/88204ee2-fa2e-4780-97c0-4ca5aa8554fe-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-cpqvb\" (UID: \"88204ee2-fa2e-4780-97c0-4ca5aa8554fe\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-cpqvb" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.614186 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/88204ee2-fa2e-4780-97c0-4ca5aa8554fe-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-cpqvb\" (UID: \"88204ee2-fa2e-4780-97c0-4ca5aa8554fe\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-cpqvb" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.714726 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-wm4nq"] Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.766969 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-ctjwp"] Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.823239 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-cpqvb" Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.843145 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-ctjwp" event={"ID":"47720eeb-4718-4077-8c64-8184aa08b670","Type":"ContainerStarted","Data":"04f9380b509c578a560433030cbeb76a179834b62c8c44caf93be831c1a65ecb"} Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.844076 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-wm4nq" event={"ID":"76931d96-861b-4372-9209-98f4b296df1c","Type":"ContainerStarted","Data":"f45f85f46c95a1ac162760271e44879cd0d1a1b75cf85bb3f110739e390123be"} Dec 05 06:05:12 crc kubenswrapper[4865]: I1205 06:05:12.844901 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6nk5l" event={"ID":"4b89f3ad-3a92-467c-a613-5c567bbe8e0e","Type":"ContainerStarted","Data":"2b5532d9fd0f88445849a1eacca215dad48a3c1873b86a79d2d45ab3a5ab115a"} Dec 05 06:05:13 crc kubenswrapper[4865]: I1205 06:05:13.029624 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d87879d95-d6dp4"] Dec 05 06:05:13 crc kubenswrapper[4865]: W1205 06:05:13.033092 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd43a5f18_5aa4_4789_8b32_08f5b7f31eb9.slice/crio-f509100f5b82c11fe99c7a7c1d6a468d2c0fd836eadcccc6026bc04069a749ab WatchSource:0}: Error finding container f509100f5b82c11fe99c7a7c1d6a468d2c0fd836eadcccc6026bc04069a749ab: Status 404 returned error can't find the container with id f509100f5b82c11fe99c7a7c1d6a468d2c0fd836eadcccc6026bc04069a749ab Dec 05 06:05:13 crc kubenswrapper[4865]: I1205 06:05:13.051107 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-cpqvb"] Dec 05 06:05:13 crc kubenswrapper[4865]: W1205 06:05:13.057350 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88204ee2_fa2e_4780_97c0_4ca5aa8554fe.slice/crio-448a549f1c407b873f21d6bdd93eaae4e7eecd34bbcad6a05664f0897e3289df WatchSource:0}: Error finding container 448a549f1c407b873f21d6bdd93eaae4e7eecd34bbcad6a05664f0897e3289df: Status 404 returned error can't find the container with id 448a549f1c407b873f21d6bdd93eaae4e7eecd34bbcad6a05664f0897e3289df Dec 05 06:05:13 crc kubenswrapper[4865]: I1205 06:05:13.852587 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d87879d95-d6dp4" event={"ID":"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9","Type":"ContainerStarted","Data":"a81caf608ab189df1f2cb4c7ccfe08422c9caa9ac0424d7b659a7fdf8b53003c"} Dec 05 06:05:13 crc kubenswrapper[4865]: I1205 06:05:13.852931 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d87879d95-d6dp4" event={"ID":"d43a5f18-5aa4-4789-8b32-08f5b7f31eb9","Type":"ContainerStarted","Data":"f509100f5b82c11fe99c7a7c1d6a468d2c0fd836eadcccc6026bc04069a749ab"} Dec 05 06:05:13 crc kubenswrapper[4865]: I1205 06:05:13.854974 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-cpqvb" event={"ID":"88204ee2-fa2e-4780-97c0-4ca5aa8554fe","Type":"ContainerStarted","Data":"448a549f1c407b873f21d6bdd93eaae4e7eecd34bbcad6a05664f0897e3289df"} Dec 05 06:05:13 crc kubenswrapper[4865]: I1205 06:05:13.874979 4865 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-console/console-7d87879d95-d6dp4" podStartSLOduration=1.874964509 podStartE2EDuration="1.874964509s" podCreationTimestamp="2025-12-05 06:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:05:13.872921921 +0000 UTC m=+733.152933143" watchObservedRunningTime="2025-12-05 06:05:13.874964509 +0000 UTC m=+733.154975731" Dec 05 06:05:15 crc kubenswrapper[4865]: I1205 06:05:15.870056 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6nk5l" event={"ID":"4b89f3ad-3a92-467c-a613-5c567bbe8e0e","Type":"ContainerStarted","Data":"b67447d3f9a692c11101ef7eb7366ed399ba5344b31af21ddfc87ea5adad8477"} Dec 05 06:05:15 crc kubenswrapper[4865]: I1205 06:05:15.870917 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-6nk5l" Dec 05 06:05:15 crc kubenswrapper[4865]: I1205 06:05:15.873247 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-cpqvb" event={"ID":"88204ee2-fa2e-4780-97c0-4ca5aa8554fe","Type":"ContainerStarted","Data":"1993aa1b88c09f7064ab01367fc5214871ebf483a35529057979555fb8c75f14"} Dec 05 06:05:15 crc kubenswrapper[4865]: I1205 06:05:15.873382 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-cpqvb" Dec 05 06:05:15 crc kubenswrapper[4865]: I1205 06:05:15.875227 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-ctjwp" event={"ID":"47720eeb-4718-4077-8c64-8184aa08b670","Type":"ContainerStarted","Data":"e617105ab9fb1e88a77198132574d11aec97d98cc6c591a5912f52d78cdeb052"} Dec 05 06:05:15 crc kubenswrapper[4865]: I1205 06:05:15.878507 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-wm4nq" event={"ID":"76931d96-861b-4372-9209-98f4b296df1c","Type":"ContainerStarted","Data":"80d54b106a9a5d4721923ca51e09045899596b84aeef2719a9ae7ba67eb27df9"} Dec 05 06:05:15 crc kubenswrapper[4865]: I1205 06:05:15.897117 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-6nk5l" podStartSLOduration=1.9215405909999999 podStartE2EDuration="4.89709899s" podCreationTimestamp="2025-12-05 06:05:11 +0000 UTC" firstStartedPulling="2025-12-05 06:05:12.32211797 +0000 UTC m=+731.602129192" lastFinishedPulling="2025-12-05 06:05:15.297676359 +0000 UTC m=+734.577687591" observedRunningTime="2025-12-05 06:05:15.895765062 +0000 UTC m=+735.175776304" watchObservedRunningTime="2025-12-05 06:05:15.89709899 +0000 UTC m=+735.177110212" Dec 05 06:05:15 crc kubenswrapper[4865]: I1205 06:05:15.920101 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-cpqvb" podStartSLOduration=2.681329826 podStartE2EDuration="4.920076567s" podCreationTimestamp="2025-12-05 06:05:11 +0000 UTC" firstStartedPulling="2025-12-05 06:05:13.06040946 +0000 UTC m=+732.340420702" lastFinishedPulling="2025-12-05 06:05:15.299156201 +0000 UTC m=+734.579167443" observedRunningTime="2025-12-05 06:05:15.914453446 +0000 UTC m=+735.194464668" watchObservedRunningTime="2025-12-05 06:05:15.920076567 +0000 UTC m=+735.200087809" Dec 05 06:05:15 crc kubenswrapper[4865]: I1205 06:05:15.948751 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-ctjwp" podStartSLOduration=1.428336276 podStartE2EDuration="3.948727436s" podCreationTimestamp="2025-12-05 06:05:12 +0000 UTC" firstStartedPulling="2025-12-05 06:05:12.778628647 +0000 UTC m=+732.058639869" lastFinishedPulling="2025-12-05 06:05:15.299019797 +0000 UTC m=+734.579031029" observedRunningTime="2025-12-05 06:05:15.942108826 +0000 UTC m=+735.222120058" watchObservedRunningTime="2025-12-05 06:05:15.948727436 +0000 UTC m=+735.228738668" Dec 05 06:05:18 crc kubenswrapper[4865]: I1205 06:05:18.902984 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-wm4nq" event={"ID":"76931d96-861b-4372-9209-98f4b296df1c","Type":"ContainerStarted","Data":"a010342d07d55aa448b7dac18ea23aebf843bd9988e24340e297127d956819a7"} Dec 05 06:05:18 crc kubenswrapper[4865]: I1205 06:05:18.933762 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-wm4nq" podStartSLOduration=2.818767753 podStartE2EDuration="7.933742555s" podCreationTimestamp="2025-12-05 06:05:11 +0000 UTC" firstStartedPulling="2025-12-05 06:05:12.725111907 +0000 UTC m=+732.005123129" lastFinishedPulling="2025-12-05 06:05:17.840086719 +0000 UTC m=+737.120097931" observedRunningTime="2025-12-05 06:05:18.923606166 +0000 UTC m=+738.203617418" watchObservedRunningTime="2025-12-05 06:05:18.933742555 +0000 UTC m=+738.213753787" Dec 05 06:05:22 crc kubenswrapper[4865]: I1205 06:05:22.293221 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-6nk5l" Dec 05 06:05:22 crc kubenswrapper[4865]: I1205 06:05:22.571201 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:22 crc kubenswrapper[4865]: I1205 06:05:22.571291 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:22 crc kubenswrapper[4865]: I1205 06:05:22.576902 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:22 crc kubenswrapper[4865]: I1205 06:05:22.941382 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7d87879d95-d6dp4" Dec 05 06:05:23 crc kubenswrapper[4865]: I1205 06:05:23.028754 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-lclsv"] Dec 05 06:05:32 crc kubenswrapper[4865]: I1205 06:05:32.832137 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-cpqvb" Dec 05 06:05:41 crc kubenswrapper[4865]: I1205 06:05:41.051532 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:05:41 crc kubenswrapper[4865]: I1205 06:05:41.052223 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:05:41 crc kubenswrapper[4865]: I1205 
06:05:41.052288 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 06:05:41 crc kubenswrapper[4865]: I1205 06:05:41.052989 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"80e4ab6bf8f2776e8ac270a6781a82ac7a7696de67acef027bfa81b854301141"} pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 06:05:41 crc kubenswrapper[4865]: I1205 06:05:41.053053 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" containerID="cri-o://80e4ab6bf8f2776e8ac270a6781a82ac7a7696de67acef027bfa81b854301141" gracePeriod=600 Dec 05 06:05:42 crc kubenswrapper[4865]: I1205 06:05:42.106950 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="80e4ab6bf8f2776e8ac270a6781a82ac7a7696de67acef027bfa81b854301141" exitCode=0 Dec 05 06:05:42 crc kubenswrapper[4865]: I1205 06:05:42.107149 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"80e4ab6bf8f2776e8ac270a6781a82ac7a7696de67acef027bfa81b854301141"} Dec 05 06:05:42 crc kubenswrapper[4865]: I1205 06:05:42.107448 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"d163070852eac0c87032f69fbdb534afbbd8e4f78e69ec919b3b74b72f841eab"} Dec 05 06:05:42 crc kubenswrapper[4865]: I1205 06:05:42.107469 4865 scope.go:117] "RemoveContainer" containerID="0fa076b7876af986bac2b8667cbf6d275c93b22d832f82ad2c83ef7e91ad5c2a" Dec 05 06:05:47 crc kubenswrapper[4865]: I1205 06:05:47.019382 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb"] Dec 05 06:05:47 crc kubenswrapper[4865]: I1205 06:05:47.021736 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb" Dec 05 06:05:47 crc kubenswrapper[4865]: I1205 06:05:47.023893 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 06:05:47 crc kubenswrapper[4865]: I1205 06:05:47.036163 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb"] Dec 05 06:05:47 crc kubenswrapper[4865]: I1205 06:05:47.123040 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb\" (UID: \"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb" Dec 05 06:05:47 crc kubenswrapper[4865]: I1205 06:05:47.123099 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb\" (UID: \"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb" Dec 05 06:05:47 crc kubenswrapper[4865]: I1205 06:05:47.123138 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x546v\" (UniqueName: \"kubernetes.io/projected/24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec-kube-api-access-x546v\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb\" (UID: \"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb" Dec 05 06:05:47 crc kubenswrapper[4865]: I1205 06:05:47.224367 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb\" (UID: \"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb" Dec 05 06:05:47 crc kubenswrapper[4865]: I1205 06:05:47.224442 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb\" (UID: \"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb" Dec 05 06:05:47 crc kubenswrapper[4865]: I1205 06:05:47.224493 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x546v\" (UniqueName: \"kubernetes.io/projected/24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec-kube-api-access-x546v\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb\" (UID: \"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb" Dec 05 06:05:47 crc kubenswrapper[4865]: I1205 06:05:47.225258 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb\" (UID: \"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb" Dec 05 06:05:47 crc kubenswrapper[4865]: I1205 06:05:47.225520 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb\" (UID: \"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb" Dec 05 06:05:47 crc kubenswrapper[4865]: I1205 06:05:47.244027 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x546v\" (UniqueName: \"kubernetes.io/projected/24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec-kube-api-access-x546v\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb\" (UID: \"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb" Dec 05 06:05:47 crc kubenswrapper[4865]: I1205 06:05:47.338077 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb" Dec 05 06:05:47 crc kubenswrapper[4865]: I1205 06:05:47.575740 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb"] Dec 05 06:05:48 crc kubenswrapper[4865]: I1205 06:05:48.090619 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-lclsv" podUID="dff0db39-9f6f-4455-8cab-8d4cdce33b04" containerName="console" containerID="cri-o://68ebc996950a79cbb4e079dc5453a2528493c989198587a834d06c2226ca36c0" gracePeriod=15 Dec 05 06:05:48 crc kubenswrapper[4865]: I1205 06:05:48.157800 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb" event={"ID":"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec","Type":"ContainerStarted","Data":"53596d03b972f4a9de94d0931769ffa5e9b2d86fcf8b514f68e53d69ae9e6fd1"} Dec 05 06:05:48 crc kubenswrapper[4865]: I1205 06:05:48.958795 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lclsv_dff0db39-9f6f-4455-8cab-8d4cdce33b04/console/0.log" Dec 05 06:05:48 crc kubenswrapper[4865]: I1205 06:05:48.959238 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-lclsv" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.050967 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-service-ca\") pod \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.051346 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-console-config\") pod \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.051482 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgwgk\" (UniqueName: \"kubernetes.io/projected/dff0db39-9f6f-4455-8cab-8d4cdce33b04-kube-api-access-bgwgk\") pod \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.051606 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dff0db39-9f6f-4455-8cab-8d4cdce33b04-console-oauth-config\") pod \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.051671 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dff0db39-9f6f-4455-8cab-8d4cdce33b04-console-serving-cert\") pod \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.051731 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-trusted-ca-bundle\") pod \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.051801 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-oauth-serving-cert\") pod \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\" (UID: \"dff0db39-9f6f-4455-8cab-8d4cdce33b04\") " Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.052785 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dff0db39-9f6f-4455-8cab-8d4cdce33b04" (UID: "dff0db39-9f6f-4455-8cab-8d4cdce33b04"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.052802 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-console-config" (OuterVolumeSpecName: "console-config") pod "dff0db39-9f6f-4455-8cab-8d4cdce33b04" (UID: "dff0db39-9f6f-4455-8cab-8d4cdce33b04"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.052888 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dff0db39-9f6f-4455-8cab-8d4cdce33b04" (UID: "dff0db39-9f6f-4455-8cab-8d4cdce33b04"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.052981 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-service-ca" (OuterVolumeSpecName: "service-ca") pod "dff0db39-9f6f-4455-8cab-8d4cdce33b04" (UID: "dff0db39-9f6f-4455-8cab-8d4cdce33b04"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.060108 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff0db39-9f6f-4455-8cab-8d4cdce33b04-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dff0db39-9f6f-4455-8cab-8d4cdce33b04" (UID: "dff0db39-9f6f-4455-8cab-8d4cdce33b04"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.060615 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff0db39-9f6f-4455-8cab-8d4cdce33b04-kube-api-access-bgwgk" (OuterVolumeSpecName: "kube-api-access-bgwgk") pod "dff0db39-9f6f-4455-8cab-8d4cdce33b04" (UID: "dff0db39-9f6f-4455-8cab-8d4cdce33b04"). InnerVolumeSpecName "kube-api-access-bgwgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.060943 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff0db39-9f6f-4455-8cab-8d4cdce33b04-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dff0db39-9f6f-4455-8cab-8d4cdce33b04" (UID: "dff0db39-9f6f-4455-8cab-8d4cdce33b04"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.153814 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgwgk\" (UniqueName: \"kubernetes.io/projected/dff0db39-9f6f-4455-8cab-8d4cdce33b04-kube-api-access-bgwgk\") on node \"crc\" DevicePath \"\"" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.153883 4865 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dff0db39-9f6f-4455-8cab-8d4cdce33b04-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.153896 4865 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.153908 4865 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dff0db39-9f6f-4455-8cab-8d4cdce33b04-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.153922 4865 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.153935 4865 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.153948 4865 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dff0db39-9f6f-4455-8cab-8d4cdce33b04-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.164617 4865 generic.go:334] "Generic (PLEG): container finished" podID="24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec" containerID="2cc3227474fa57031196f7bad3039c71c75b6bea4f9714cc160a8778e53e8566" exitCode=0 Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.164691 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb" event={"ID":"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec","Type":"ContainerDied","Data":"2cc3227474fa57031196f7bad3039c71c75b6bea4f9714cc160a8778e53e8566"} Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.172254 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lclsv_dff0db39-9f6f-4455-8cab-8d4cdce33b04/console/0.log" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.172332 4865 generic.go:334] "Generic (PLEG): container finished" podID="dff0db39-9f6f-4455-8cab-8d4cdce33b04" containerID="68ebc996950a79cbb4e079dc5453a2528493c989198587a834d06c2226ca36c0" exitCode=2 Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.172390 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lclsv" event={"ID":"dff0db39-9f6f-4455-8cab-8d4cdce33b04","Type":"ContainerDied","Data":"68ebc996950a79cbb4e079dc5453a2528493c989198587a834d06c2226ca36c0"} Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.172441 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lclsv" 
event={"ID":"dff0db39-9f6f-4455-8cab-8d4cdce33b04","Type":"ContainerDied","Data":"1568b97006d42371f19f0b190c8cdc35d7f16852dd89d4843c36dfc21fe5c6eb"} Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.172474 4865 scope.go:117] "RemoveContainer" containerID="68ebc996950a79cbb4e079dc5453a2528493c989198587a834d06c2226ca36c0" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.172639 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lclsv" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.210772 4865 scope.go:117] "RemoveContainer" containerID="68ebc996950a79cbb4e079dc5453a2528493c989198587a834d06c2226ca36c0" Dec 05 06:05:49 crc kubenswrapper[4865]: E1205 06:05:49.211987 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68ebc996950a79cbb4e079dc5453a2528493c989198587a834d06c2226ca36c0\": container with ID starting with 68ebc996950a79cbb4e079dc5453a2528493c989198587a834d06c2226ca36c0 not found: ID does not exist" containerID="68ebc996950a79cbb4e079dc5453a2528493c989198587a834d06c2226ca36c0" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.212187 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ebc996950a79cbb4e079dc5453a2528493c989198587a834d06c2226ca36c0"} err="failed to get container status \"68ebc996950a79cbb4e079dc5453a2528493c989198587a834d06c2226ca36c0\": rpc error: code = NotFound desc = could not find container \"68ebc996950a79cbb4e079dc5453a2528493c989198587a834d06c2226ca36c0\": container with ID starting with 68ebc996950a79cbb4e079dc5453a2528493c989198587a834d06c2226ca36c0 not found: ID does not exist" Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.239498 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-lclsv"] Dec 05 06:05:49 crc kubenswrapper[4865]: I1205 06:05:49.259634 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-lclsv"] Dec 05 06:05:51 crc kubenswrapper[4865]: I1205 06:05:51.018452 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff0db39-9f6f-4455-8cab-8d4cdce33b04" path="/var/lib/kubelet/pods/dff0db39-9f6f-4455-8cab-8d4cdce33b04/volumes" Dec 05 06:05:52 crc kubenswrapper[4865]: I1205 06:05:52.201060 4865 generic.go:334] "Generic (PLEG): container finished" podID="24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec" containerID="98a8229b39b9f182a7ef0ada99b01a069d6a3bd7e4c4c7194a0f90c3dc48a0ef" exitCode=0 Dec 05 06:05:52 crc kubenswrapper[4865]: I1205 06:05:52.201134 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb" event={"ID":"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec","Type":"ContainerDied","Data":"98a8229b39b9f182a7ef0ada99b01a069d6a3bd7e4c4c7194a0f90c3dc48a0ef"} Dec 05 06:05:53 crc kubenswrapper[4865]: I1205 06:05:53.214013 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb" event={"ID":"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec","Type":"ContainerDied","Data":"b911b6f2e6543cce8d1da6963d541ea55de637a51d7aa7fd2152e1dea12d3fa8"} Dec 05 06:05:53 crc kubenswrapper[4865]: I1205 06:05:53.213936 4865 generic.go:334] "Generic (PLEG): container finished" podID="24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec" 
containerID="b911b6f2e6543cce8d1da6963d541ea55de637a51d7aa7fd2152e1dea12d3fa8" exitCode=0 Dec 05 06:05:54 crc kubenswrapper[4865]: I1205 06:05:54.552737 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb" Dec 05 06:05:54 crc kubenswrapper[4865]: I1205 06:05:54.746152 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x546v\" (UniqueName: \"kubernetes.io/projected/24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec-kube-api-access-x546v\") pod \"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec\" (UID: \"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec\") " Dec 05 06:05:54 crc kubenswrapper[4865]: I1205 06:05:54.746241 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec-bundle\") pod \"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec\" (UID: \"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec\") " Dec 05 06:05:54 crc kubenswrapper[4865]: I1205 06:05:54.746300 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec-util\") pod \"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec\" (UID: \"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec\") " Dec 05 06:05:54 crc kubenswrapper[4865]: I1205 06:05:54.747846 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec-bundle" (OuterVolumeSpecName: "bundle") pod "24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec" (UID: "24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:05:54 crc kubenswrapper[4865]: I1205 06:05:54.756570 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec-kube-api-access-x546v" (OuterVolumeSpecName: "kube-api-access-x546v") pod "24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec" (UID: "24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec"). InnerVolumeSpecName "kube-api-access-x546v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:05:54 crc kubenswrapper[4865]: I1205 06:05:54.758672 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec-util" (OuterVolumeSpecName: "util") pod "24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec" (UID: "24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:05:54 crc kubenswrapper[4865]: I1205 06:05:54.848333 4865 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec-util\") on node \"crc\" DevicePath \"\"" Dec 05 06:05:54 crc kubenswrapper[4865]: I1205 06:05:54.848381 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x546v\" (UniqueName: \"kubernetes.io/projected/24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec-kube-api-access-x546v\") on node \"crc\" DevicePath \"\"" Dec 05 06:05:54 crc kubenswrapper[4865]: I1205 06:05:54.848397 4865 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:05:55 crc kubenswrapper[4865]: I1205 06:05:55.236425 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb" event={"ID":"24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec","Type":"ContainerDied","Data":"53596d03b972f4a9de94d0931769ffa5e9b2d86fcf8b514f68e53d69ae9e6fd1"} Dec 05 06:05:55 crc kubenswrapper[4865]: I1205 06:05:55.236507 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53596d03b972f4a9de94d0931769ffa5e9b2d86fcf8b514f68e53d69ae9e6fd1" Dec 05 06:05:55 crc kubenswrapper[4865]: I1205 06:05:55.237123 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.668233 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-565b7bc7b8-6qwxr"] Dec 05 06:06:05 crc kubenswrapper[4865]: E1205 06:06:05.669143 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec" containerName="extract" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.669158 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec" containerName="extract" Dec 05 06:06:05 crc kubenswrapper[4865]: E1205 06:06:05.669180 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff0db39-9f6f-4455-8cab-8d4cdce33b04" containerName="console" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.669188 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff0db39-9f6f-4455-8cab-8d4cdce33b04" containerName="console" Dec 05 06:06:05 crc kubenswrapper[4865]: E1205 06:06:05.669203 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec" containerName="util" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.669211 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec" containerName="util" Dec 05 06:06:05 crc kubenswrapper[4865]: E1205 06:06:05.669228 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec" containerName="pull" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.669237 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec" containerName="pull" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.669382 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec" containerName="extract" Dec 
05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.669400 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff0db39-9f6f-4455-8cab-8d4cdce33b04" containerName="console" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.669816 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-565b7bc7b8-6qwxr" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.674447 4865 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.674657 4865 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.674841 4865 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-4vqpr" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.675106 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.675258 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.686785 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/46237ec7-567d-47d0-9994-120d3f2039e8-webhook-cert\") pod \"metallb-operator-controller-manager-565b7bc7b8-6qwxr\" (UID: \"46237ec7-567d-47d0-9994-120d3f2039e8\") " pod="metallb-system/metallb-operator-controller-manager-565b7bc7b8-6qwxr" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.686882 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/46237ec7-567d-47d0-9994-120d3f2039e8-apiservice-cert\") pod \"metallb-operator-controller-manager-565b7bc7b8-6qwxr\" (UID: \"46237ec7-567d-47d0-9994-120d3f2039e8\") " pod="metallb-system/metallb-operator-controller-manager-565b7bc7b8-6qwxr" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.686909 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2sq5\" (UniqueName: \"kubernetes.io/projected/46237ec7-567d-47d0-9994-120d3f2039e8-kube-api-access-x2sq5\") pod \"metallb-operator-controller-manager-565b7bc7b8-6qwxr\" (UID: \"46237ec7-567d-47d0-9994-120d3f2039e8\") " pod="metallb-system/metallb-operator-controller-manager-565b7bc7b8-6qwxr" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.752398 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-565b7bc7b8-6qwxr"] Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.787890 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/46237ec7-567d-47d0-9994-120d3f2039e8-apiservice-cert\") pod \"metallb-operator-controller-manager-565b7bc7b8-6qwxr\" (UID: \"46237ec7-567d-47d0-9994-120d3f2039e8\") " pod="metallb-system/metallb-operator-controller-manager-565b7bc7b8-6qwxr" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.787950 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2sq5\" (UniqueName: 
\"kubernetes.io/projected/46237ec7-567d-47d0-9994-120d3f2039e8-kube-api-access-x2sq5\") pod \"metallb-operator-controller-manager-565b7bc7b8-6qwxr\" (UID: \"46237ec7-567d-47d0-9994-120d3f2039e8\") " pod="metallb-system/metallb-operator-controller-manager-565b7bc7b8-6qwxr" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.787992 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/46237ec7-567d-47d0-9994-120d3f2039e8-webhook-cert\") pod \"metallb-operator-controller-manager-565b7bc7b8-6qwxr\" (UID: \"46237ec7-567d-47d0-9994-120d3f2039e8\") " pod="metallb-system/metallb-operator-controller-manager-565b7bc7b8-6qwxr" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.794918 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/46237ec7-567d-47d0-9994-120d3f2039e8-apiservice-cert\") pod \"metallb-operator-controller-manager-565b7bc7b8-6qwxr\" (UID: \"46237ec7-567d-47d0-9994-120d3f2039e8\") " pod="metallb-system/metallb-operator-controller-manager-565b7bc7b8-6qwxr" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.808351 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/46237ec7-567d-47d0-9994-120d3f2039e8-webhook-cert\") pod \"metallb-operator-controller-manager-565b7bc7b8-6qwxr\" (UID: \"46237ec7-567d-47d0-9994-120d3f2039e8\") " pod="metallb-system/metallb-operator-controller-manager-565b7bc7b8-6qwxr" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.824777 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2sq5\" (UniqueName: \"kubernetes.io/projected/46237ec7-567d-47d0-9994-120d3f2039e8-kube-api-access-x2sq5\") pod \"metallb-operator-controller-manager-565b7bc7b8-6qwxr\" (UID: \"46237ec7-567d-47d0-9994-120d3f2039e8\") " pod="metallb-system/metallb-operator-controller-manager-565b7bc7b8-6qwxr" Dec 05 06:06:05 crc kubenswrapper[4865]: I1205 06:06:05.988929 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-565b7bc7b8-6qwxr" Dec 05 06:06:06 crc kubenswrapper[4865]: I1205 06:06:06.002301 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b4f9f77c-pdn56"] Dec 05 06:06:06 crc kubenswrapper[4865]: I1205 06:06:06.003096 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5b4f9f77c-pdn56" Dec 05 06:06:06 crc kubenswrapper[4865]: I1205 06:06:06.010441 4865 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 05 06:06:06 crc kubenswrapper[4865]: I1205 06:06:06.010714 4865 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 05 06:06:06 crc kubenswrapper[4865]: I1205 06:06:06.010976 4865 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-85rgw" Dec 05 06:06:06 crc kubenswrapper[4865]: I1205 06:06:06.091740 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b4f9f77c-pdn56"] Dec 05 06:06:06 crc kubenswrapper[4865]: I1205 06:06:06.093593 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbkgq\" (UniqueName: \"kubernetes.io/projected/d30b601d-b803-4d64-923f-b085545350ee-kube-api-access-mbkgq\") pod \"metallb-operator-webhook-server-5b4f9f77c-pdn56\" (UID: \"d30b601d-b803-4d64-923f-b085545350ee\") " pod="metallb-system/metallb-operator-webhook-server-5b4f9f77c-pdn56" Dec 05 06:06:06 crc kubenswrapper[4865]: I1205 06:06:06.093746 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d30b601d-b803-4d64-923f-b085545350ee-webhook-cert\") pod \"metallb-operator-webhook-server-5b4f9f77c-pdn56\" (UID: \"d30b601d-b803-4d64-923f-b085545350ee\") " pod="metallb-system/metallb-operator-webhook-server-5b4f9f77c-pdn56" Dec 05 06:06:06 crc kubenswrapper[4865]: I1205 06:06:06.093797 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d30b601d-b803-4d64-923f-b085545350ee-apiservice-cert\") pod \"metallb-operator-webhook-server-5b4f9f77c-pdn56\" (UID: \"d30b601d-b803-4d64-923f-b085545350ee\") " pod="metallb-system/metallb-operator-webhook-server-5b4f9f77c-pdn56" Dec 05 06:06:06 crc kubenswrapper[4865]: I1205 06:06:06.194722 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d30b601d-b803-4d64-923f-b085545350ee-apiservice-cert\") pod \"metallb-operator-webhook-server-5b4f9f77c-pdn56\" (UID: \"d30b601d-b803-4d64-923f-b085545350ee\") " pod="metallb-system/metallb-operator-webhook-server-5b4f9f77c-pdn56" Dec 05 06:06:06 crc kubenswrapper[4865]: I1205 06:06:06.194772 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbkgq\" (UniqueName: \"kubernetes.io/projected/d30b601d-b803-4d64-923f-b085545350ee-kube-api-access-mbkgq\") pod \"metallb-operator-webhook-server-5b4f9f77c-pdn56\" (UID: \"d30b601d-b803-4d64-923f-b085545350ee\") " pod="metallb-system/metallb-operator-webhook-server-5b4f9f77c-pdn56" Dec 05 06:06:06 crc kubenswrapper[4865]: I1205 06:06:06.194858 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d30b601d-b803-4d64-923f-b085545350ee-webhook-cert\") pod \"metallb-operator-webhook-server-5b4f9f77c-pdn56\" (UID: \"d30b601d-b803-4d64-923f-b085545350ee\") " pod="metallb-system/metallb-operator-webhook-server-5b4f9f77c-pdn56" Dec 05 06:06:06 crc kubenswrapper[4865]: I1205 06:06:06.217918 4865 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d30b601d-b803-4d64-923f-b085545350ee-webhook-cert\") pod \"metallb-operator-webhook-server-5b4f9f77c-pdn56\" (UID: \"d30b601d-b803-4d64-923f-b085545350ee\") " pod="metallb-system/metallb-operator-webhook-server-5b4f9f77c-pdn56" Dec 05 06:06:06 crc kubenswrapper[4865]: I1205 06:06:06.218027 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d30b601d-b803-4d64-923f-b085545350ee-apiservice-cert\") pod \"metallb-operator-webhook-server-5b4f9f77c-pdn56\" (UID: \"d30b601d-b803-4d64-923f-b085545350ee\") " pod="metallb-system/metallb-operator-webhook-server-5b4f9f77c-pdn56" Dec 05 06:06:06 crc kubenswrapper[4865]: I1205 06:06:06.224542 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbkgq\" (UniqueName: \"kubernetes.io/projected/d30b601d-b803-4d64-923f-b085545350ee-kube-api-access-mbkgq\") pod \"metallb-operator-webhook-server-5b4f9f77c-pdn56\" (UID: \"d30b601d-b803-4d64-923f-b085545350ee\") " pod="metallb-system/metallb-operator-webhook-server-5b4f9f77c-pdn56" Dec 05 06:06:06 crc kubenswrapper[4865]: I1205 06:06:06.366882 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5b4f9f77c-pdn56" Dec 05 06:06:06 crc kubenswrapper[4865]: I1205 06:06:06.560477 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-565b7bc7b8-6qwxr"] Dec 05 06:06:06 crc kubenswrapper[4865]: I1205 06:06:06.876714 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b4f9f77c-pdn56"] Dec 05 06:06:06 crc kubenswrapper[4865]: W1205 06:06:06.880780 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd30b601d_b803_4d64_923f_b085545350ee.slice/crio-feae40b97a233d61708a241ad44c777af8b30ec23e468bdf0b72a68b5fdbf52b WatchSource:0}: Error finding container feae40b97a233d61708a241ad44c777af8b30ec23e468bdf0b72a68b5fdbf52b: Status 404 returned error can't find the container with id feae40b97a233d61708a241ad44c777af8b30ec23e468bdf0b72a68b5fdbf52b Dec 05 06:06:07 crc kubenswrapper[4865]: I1205 06:06:07.327543 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-565b7bc7b8-6qwxr" event={"ID":"46237ec7-567d-47d0-9994-120d3f2039e8","Type":"ContainerStarted","Data":"7d33caac6dcea54a76af64c13bea57f96ad1f25b83a537e86d43446334d60055"} Dec 05 06:06:07 crc kubenswrapper[4865]: I1205 06:06:07.329006 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5b4f9f77c-pdn56" event={"ID":"d30b601d-b803-4d64-923f-b085545350ee","Type":"ContainerStarted","Data":"feae40b97a233d61708a241ad44c777af8b30ec23e468bdf0b72a68b5fdbf52b"} Dec 05 06:06:13 crc kubenswrapper[4865]: I1205 06:06:13.381640 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5b4f9f77c-pdn56" event={"ID":"d30b601d-b803-4d64-923f-b085545350ee","Type":"ContainerStarted","Data":"ca9842e69155592ded1f53cdf20c1f3c241d071825fdbbb2c998bbd8f8e54327"} Dec 05 06:06:13 crc kubenswrapper[4865]: I1205 06:06:13.383479 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-5b4f9f77c-pdn56" Dec 05 06:06:13 crc kubenswrapper[4865]: I1205 06:06:13.385206 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-565b7bc7b8-6qwxr" event={"ID":"46237ec7-567d-47d0-9994-120d3f2039e8","Type":"ContainerStarted","Data":"977bf4053b8e3403717c35705f134726052ad43d4e3d260e82c71e6282040173"} Dec 05 06:06:13 crc kubenswrapper[4865]: I1205 06:06:13.385946 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-565b7bc7b8-6qwxr" Dec 05 06:06:13 crc kubenswrapper[4865]: I1205 06:06:13.409386 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5b4f9f77c-pdn56" podStartSLOduration=2.993267593 podStartE2EDuration="8.409367671s" podCreationTimestamp="2025-12-05 06:06:05 +0000 UTC" firstStartedPulling="2025-12-05 06:06:06.884062947 +0000 UTC m=+786.164074169" lastFinishedPulling="2025-12-05 06:06:12.300163025 +0000 UTC m=+791.580174247" observedRunningTime="2025-12-05 06:06:13.406602663 +0000 UTC m=+792.686613885" watchObservedRunningTime="2025-12-05 06:06:13.409367671 +0000 UTC m=+792.689378893" Dec 05 06:06:13 crc kubenswrapper[4865]: I1205 06:06:13.438356 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-565b7bc7b8-6qwxr" podStartSLOduration=2.806457372 podStartE2EDuration="8.438335351s" podCreationTimestamp="2025-12-05 06:06:05 +0000 UTC" firstStartedPulling="2025-12-05 06:06:06.576229902 +0000 UTC m=+785.856241124" lastFinishedPulling="2025-12-05 06:06:12.208107881 +0000 UTC m=+791.488119103" observedRunningTime="2025-12-05 06:06:13.436842308 +0000 UTC m=+792.716853540" watchObservedRunningTime="2025-12-05 06:06:13.438335351 +0000 UTC m=+792.718346583" Dec 05 06:06:26 crc kubenswrapper[4865]: I1205 06:06:26.375929 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5b4f9f77c-pdn56" Dec 05 06:06:45 crc kubenswrapper[4865]: I1205 06:06:45.992666 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-565b7bc7b8-6qwxr" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.728500 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-vhnps"] Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.729436 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-vhnps" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.736054 4865 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.736624 4865 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-vh72x" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.749418 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-jjpk7"] Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.751595 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.755011 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.756106 4865 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.789835 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-vhnps"] Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.861403 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-jmvs2"] Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.862560 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-jmvs2" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.867214 4865 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.867240 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.867330 4865 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.872681 4865 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-lj2rx" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.876396 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-m74rz"] Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.877588 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-m74rz" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.879546 4865 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.890029 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-m74rz"] Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.898833 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-frr-startup\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.898871 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-metrics-certs\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.898917 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-frr-conf\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.898934 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm4gz\" (UniqueName: \"kubernetes.io/projected/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-kube-api-access-mm4gz\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.898951 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-reloader\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.898994 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/048569aa-8159-43b3-9ed2-55cef99d90bb-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-vhnps\" (UID: \"048569aa-8159-43b3-9ed2-55cef99d90bb\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-vhnps" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.899013 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmzhl\" (UniqueName: \"kubernetes.io/projected/048569aa-8159-43b3-9ed2-55cef99d90bb-kube-api-access-gmzhl\") pod \"frr-k8s-webhook-server-7fcb986d4-vhnps\" (UID: \"048569aa-8159-43b3-9ed2-55cef99d90bb\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-vhnps" Dec 05 06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.899032 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-frr-sockets\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 
06:06:46 crc kubenswrapper[4865]: I1205 06:06:46.899050 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-metrics\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.000328 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-reloader\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.000418 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ntv2\" (UniqueName: \"kubernetes.io/projected/ce563705-9a7e-4202-a8f4-512c17a481fb-kube-api-access-6ntv2\") pod \"speaker-jmvs2\" (UID: \"ce563705-9a7e-4202-a8f4-512c17a481fb\") " pod="metallb-system/speaker-jmvs2" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.000486 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84cvw\" (UniqueName: \"kubernetes.io/projected/a03656bf-d0cc-4e06-b6ce-470766d186d0-kube-api-access-84cvw\") pod \"controller-f8648f98b-m74rz\" (UID: \"a03656bf-d0cc-4e06-b6ce-470766d186d0\") " pod="metallb-system/controller-f8648f98b-m74rz" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.000516 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce563705-9a7e-4202-a8f4-512c17a481fb-memberlist\") pod \"speaker-jmvs2\" (UID: \"ce563705-9a7e-4202-a8f4-512c17a481fb\") " pod="metallb-system/speaker-jmvs2" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.000571 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/048569aa-8159-43b3-9ed2-55cef99d90bb-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-vhnps\" (UID: \"048569aa-8159-43b3-9ed2-55cef99d90bb\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-vhnps" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.000599 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-frr-sockets\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.000641 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmzhl\" (UniqueName: \"kubernetes.io/projected/048569aa-8159-43b3-9ed2-55cef99d90bb-kube-api-access-gmzhl\") pod \"frr-k8s-webhook-server-7fcb986d4-vhnps\" (UID: \"048569aa-8159-43b3-9ed2-55cef99d90bb\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-vhnps" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.000668 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-metrics\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.000695 4865 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a03656bf-d0cc-4e06-b6ce-470766d186d0-metrics-certs\") pod \"controller-f8648f98b-m74rz\" (UID: \"a03656bf-d0cc-4e06-b6ce-470766d186d0\") " pod="metallb-system/controller-f8648f98b-m74rz" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.000747 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-frr-startup\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.000773 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-metrics-certs\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.000881 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a03656bf-d0cc-4e06-b6ce-470766d186d0-cert\") pod \"controller-f8648f98b-m74rz\" (UID: \"a03656bf-d0cc-4e06-b6ce-470766d186d0\") " pod="metallb-system/controller-f8648f98b-m74rz" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.000908 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce563705-9a7e-4202-a8f4-512c17a481fb-metrics-certs\") pod \"speaker-jmvs2\" (UID: \"ce563705-9a7e-4202-a8f4-512c17a481fb\") " pod="metallb-system/speaker-jmvs2" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.000963 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ce563705-9a7e-4202-a8f4-512c17a481fb-metallb-excludel2\") pod \"speaker-jmvs2\" (UID: \"ce563705-9a7e-4202-a8f4-512c17a481fb\") " pod="metallb-system/speaker-jmvs2" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.000989 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-frr-conf\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.001038 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm4gz\" (UniqueName: \"kubernetes.io/projected/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-kube-api-access-mm4gz\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.002216 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-metrics\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.002271 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-frr-sockets\") pod \"frr-k8s-jjpk7\" (UID: 
\"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.003047 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-frr-startup\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.003250 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-reloader\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.003342 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-frr-conf\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.008412 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-metrics-certs\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.011507 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/048569aa-8159-43b3-9ed2-55cef99d90bb-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-vhnps\" (UID: \"048569aa-8159-43b3-9ed2-55cef99d90bb\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-vhnps" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.018710 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm4gz\" (UniqueName: \"kubernetes.io/projected/0f46c5b9-45e6-4002-a5e8-e07ecf828a80-kube-api-access-mm4gz\") pod \"frr-k8s-jjpk7\" (UID: \"0f46c5b9-45e6-4002-a5e8-e07ecf828a80\") " pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.028763 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmzhl\" (UniqueName: \"kubernetes.io/projected/048569aa-8159-43b3-9ed2-55cef99d90bb-kube-api-access-gmzhl\") pod \"frr-k8s-webhook-server-7fcb986d4-vhnps\" (UID: \"048569aa-8159-43b3-9ed2-55cef99d90bb\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-vhnps" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.045115 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-vhnps" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.064323 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.103125 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a03656bf-d0cc-4e06-b6ce-470766d186d0-metrics-certs\") pod \"controller-f8648f98b-m74rz\" (UID: \"a03656bf-d0cc-4e06-b6ce-470766d186d0\") " pod="metallb-system/controller-f8648f98b-m74rz" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.103260 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a03656bf-d0cc-4e06-b6ce-470766d186d0-cert\") pod \"controller-f8648f98b-m74rz\" (UID: \"a03656bf-d0cc-4e06-b6ce-470766d186d0\") " pod="metallb-system/controller-f8648f98b-m74rz" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.103280 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce563705-9a7e-4202-a8f4-512c17a481fb-metrics-certs\") pod \"speaker-jmvs2\" (UID: \"ce563705-9a7e-4202-a8f4-512c17a481fb\") " pod="metallb-system/speaker-jmvs2" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.103327 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ce563705-9a7e-4202-a8f4-512c17a481fb-metallb-excludel2\") pod \"speaker-jmvs2\" (UID: \"ce563705-9a7e-4202-a8f4-512c17a481fb\") " pod="metallb-system/speaker-jmvs2" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.103358 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ntv2\" (UniqueName: \"kubernetes.io/projected/ce563705-9a7e-4202-a8f4-512c17a481fb-kube-api-access-6ntv2\") pod \"speaker-jmvs2\" (UID: \"ce563705-9a7e-4202-a8f4-512c17a481fb\") " pod="metallb-system/speaker-jmvs2" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.103419 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84cvw\" (UniqueName: \"kubernetes.io/projected/a03656bf-d0cc-4e06-b6ce-470766d186d0-kube-api-access-84cvw\") pod \"controller-f8648f98b-m74rz\" (UID: \"a03656bf-d0cc-4e06-b6ce-470766d186d0\") " pod="metallb-system/controller-f8648f98b-m74rz" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.103442 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce563705-9a7e-4202-a8f4-512c17a481fb-memberlist\") pod \"speaker-jmvs2\" (UID: \"ce563705-9a7e-4202-a8f4-512c17a481fb\") " pod="metallb-system/speaker-jmvs2" Dec 05 06:06:47 crc kubenswrapper[4865]: E1205 06:06:47.104521 4865 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 06:06:47 crc kubenswrapper[4865]: E1205 06:06:47.104602 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce563705-9a7e-4202-a8f4-512c17a481fb-memberlist podName:ce563705-9a7e-4202-a8f4-512c17a481fb nodeName:}" failed. No retries permitted until 2025-12-05 06:06:47.604576496 +0000 UTC m=+826.884587718 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ce563705-9a7e-4202-a8f4-512c17a481fb-memberlist") pod "speaker-jmvs2" (UID: "ce563705-9a7e-4202-a8f4-512c17a481fb") : secret "metallb-memberlist" not found Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.104771 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ce563705-9a7e-4202-a8f4-512c17a481fb-metallb-excludel2\") pod \"speaker-jmvs2\" (UID: \"ce563705-9a7e-4202-a8f4-512c17a481fb\") " pod="metallb-system/speaker-jmvs2" Dec 05 06:06:47 crc kubenswrapper[4865]: E1205 06:06:47.104981 4865 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 05 06:06:47 crc kubenswrapper[4865]: E1205 06:06:47.105060 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a03656bf-d0cc-4e06-b6ce-470766d186d0-metrics-certs podName:a03656bf-d0cc-4e06-b6ce-470766d186d0 nodeName:}" failed. No retries permitted until 2025-12-05 06:06:47.605032248 +0000 UTC m=+826.885043690 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a03656bf-d0cc-4e06-b6ce-470766d186d0-metrics-certs") pod "controller-f8648f98b-m74rz" (UID: "a03656bf-d0cc-4e06-b6ce-470766d186d0") : secret "controller-certs-secret" not found Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.109314 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce563705-9a7e-4202-a8f4-512c17a481fb-metrics-certs\") pod \"speaker-jmvs2\" (UID: \"ce563705-9a7e-4202-a8f4-512c17a481fb\") " pod="metallb-system/speaker-jmvs2" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.112584 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a03656bf-d0cc-4e06-b6ce-470766d186d0-cert\") pod \"controller-f8648f98b-m74rz\" (UID: \"a03656bf-d0cc-4e06-b6ce-470766d186d0\") " pod="metallb-system/controller-f8648f98b-m74rz" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.124390 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ntv2\" (UniqueName: \"kubernetes.io/projected/ce563705-9a7e-4202-a8f4-512c17a481fb-kube-api-access-6ntv2\") pod \"speaker-jmvs2\" (UID: \"ce563705-9a7e-4202-a8f4-512c17a481fb\") " pod="metallb-system/speaker-jmvs2" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.128220 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84cvw\" (UniqueName: \"kubernetes.io/projected/a03656bf-d0cc-4e06-b6ce-470766d186d0-kube-api-access-84cvw\") pod \"controller-f8648f98b-m74rz\" (UID: \"a03656bf-d0cc-4e06-b6ce-470766d186d0\") " pod="metallb-system/controller-f8648f98b-m74rz" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.333833 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-vhnps"] Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.612739 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce563705-9a7e-4202-a8f4-512c17a481fb-memberlist\") pod \"speaker-jmvs2\" (UID: \"ce563705-9a7e-4202-a8f4-512c17a481fb\") " pod="metallb-system/speaker-jmvs2" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.612842 4865 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a03656bf-d0cc-4e06-b6ce-470766d186d0-metrics-certs\") pod \"controller-f8648f98b-m74rz\" (UID: \"a03656bf-d0cc-4e06-b6ce-470766d186d0\") " pod="metallb-system/controller-f8648f98b-m74rz" Dec 05 06:06:47 crc kubenswrapper[4865]: E1205 06:06:47.613295 4865 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 06:06:47 crc kubenswrapper[4865]: E1205 06:06:47.613371 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce563705-9a7e-4202-a8f4-512c17a481fb-memberlist podName:ce563705-9a7e-4202-a8f4-512c17a481fb nodeName:}" failed. No retries permitted until 2025-12-05 06:06:48.613346682 +0000 UTC m=+827.893357914 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ce563705-9a7e-4202-a8f4-512c17a481fb-memberlist") pod "speaker-jmvs2" (UID: "ce563705-9a7e-4202-a8f4-512c17a481fb") : secret "metallb-memberlist" not found Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.621607 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-vhnps" event={"ID":"048569aa-8159-43b3-9ed2-55cef99d90bb","Type":"ContainerStarted","Data":"4d86539b4a2a141c7ce8d570dca2999ceb6768bbdad54fed2b0db1ddf2295e5e"} Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.622914 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a03656bf-d0cc-4e06-b6ce-470766d186d0-metrics-certs\") pod \"controller-f8648f98b-m74rz\" (UID: \"a03656bf-d0cc-4e06-b6ce-470766d186d0\") " pod="metallb-system/controller-f8648f98b-m74rz" Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.623562 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jjpk7" event={"ID":"0f46c5b9-45e6-4002-a5e8-e07ecf828a80","Type":"ContainerStarted","Data":"1b6bca44341de6dd13007ac565261d244c43f00c4aac410b131ab2ec27e06096"} Dec 05 06:06:47 crc kubenswrapper[4865]: I1205 06:06:47.788994 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-m74rz" Dec 05 06:06:48 crc kubenswrapper[4865]: I1205 06:06:48.403775 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-m74rz"] Dec 05 06:06:48 crc kubenswrapper[4865]: I1205 06:06:48.636874 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce563705-9a7e-4202-a8f4-512c17a481fb-memberlist\") pod \"speaker-jmvs2\" (UID: \"ce563705-9a7e-4202-a8f4-512c17a481fb\") " pod="metallb-system/speaker-jmvs2" Dec 05 06:06:48 crc kubenswrapper[4865]: I1205 06:06:48.640557 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-m74rz" event={"ID":"a03656bf-d0cc-4e06-b6ce-470766d186d0","Type":"ContainerStarted","Data":"dc0b352299fc2a8d71f86ef794b018d4d312793b72d5f225c773d2e42ab7a565"} Dec 05 06:06:48 crc kubenswrapper[4865]: I1205 06:06:48.640616 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-m74rz" event={"ID":"a03656bf-d0cc-4e06-b6ce-470766d186d0","Type":"ContainerStarted","Data":"3acf21f66e8cf9b93b060a8086e74535fbfddb81c2115eedcd6efc416a7d61fe"} Dec 05 06:06:48 crc kubenswrapper[4865]: I1205 06:06:48.651336 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce563705-9a7e-4202-a8f4-512c17a481fb-memberlist\") pod \"speaker-jmvs2\" (UID: \"ce563705-9a7e-4202-a8f4-512c17a481fb\") " pod="metallb-system/speaker-jmvs2" Dec 05 06:06:48 crc kubenswrapper[4865]: I1205 06:06:48.676444 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-jmvs2" Dec 05 06:06:48 crc kubenswrapper[4865]: W1205 06:06:48.695351 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce563705_9a7e_4202_a8f4_512c17a481fb.slice/crio-12df301063d3b7f84ee3244000b4e2ec334f4f9d35ce30bb3e5147c9e8a9e960 WatchSource:0}: Error finding container 12df301063d3b7f84ee3244000b4e2ec334f4f9d35ce30bb3e5147c9e8a9e960: Status 404 returned error can't find the container with id 12df301063d3b7f84ee3244000b4e2ec334f4f9d35ce30bb3e5147c9e8a9e960 Dec 05 06:06:49 crc kubenswrapper[4865]: I1205 06:06:49.650793 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-m74rz" event={"ID":"a03656bf-d0cc-4e06-b6ce-470766d186d0","Type":"ContainerStarted","Data":"5d2f8bae4b8d0b516b33ce4d930d18057a020193df3e682df7b352d42c27bb4d"} Dec 05 06:06:49 crc kubenswrapper[4865]: I1205 06:06:49.651165 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-m74rz" Dec 05 06:06:49 crc kubenswrapper[4865]: I1205 06:06:49.656986 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jmvs2" event={"ID":"ce563705-9a7e-4202-a8f4-512c17a481fb","Type":"ContainerStarted","Data":"9e2f8c7b5de7760b7441c33d186ad04d4cfee8463a9afd8ccccc23ea61fccb4c"} Dec 05 06:06:49 crc kubenswrapper[4865]: I1205 06:06:49.657256 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jmvs2" event={"ID":"ce563705-9a7e-4202-a8f4-512c17a481fb","Type":"ContainerStarted","Data":"f5b7cc5880b21d89436441bdcfc19c06fae4186ab732d4108ea88569b95f2fa7"} Dec 05 06:06:49 crc kubenswrapper[4865]: I1205 06:06:49.657356 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jmvs2" 
event={"ID":"ce563705-9a7e-4202-a8f4-512c17a481fb","Type":"ContainerStarted","Data":"12df301063d3b7f84ee3244000b4e2ec334f4f9d35ce30bb3e5147c9e8a9e960"} Dec 05 06:06:49 crc kubenswrapper[4865]: I1205 06:06:49.657513 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-jmvs2" Dec 05 06:06:49 crc kubenswrapper[4865]: I1205 06:06:49.693616 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-m74rz" podStartSLOduration=3.693596438 podStartE2EDuration="3.693596438s" podCreationTimestamp="2025-12-05 06:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:06:49.691928611 +0000 UTC m=+828.971939833" watchObservedRunningTime="2025-12-05 06:06:49.693596438 +0000 UTC m=+828.973607660" Dec 05 06:06:49 crc kubenswrapper[4865]: I1205 06:06:49.734230 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-jmvs2" podStartSLOduration=3.734209327 podStartE2EDuration="3.734209327s" podCreationTimestamp="2025-12-05 06:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:06:49.731924342 +0000 UTC m=+829.011935564" watchObservedRunningTime="2025-12-05 06:06:49.734209327 +0000 UTC m=+829.014220549" Dec 05 06:06:56 crc kubenswrapper[4865]: I1205 06:06:56.707213 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-vhnps" event={"ID":"048569aa-8159-43b3-9ed2-55cef99d90bb","Type":"ContainerStarted","Data":"836f2c38d8fb817fd954e10b79b5fad75cca2f4e39ff899a3f2c64d5677c92bc"} Dec 05 06:06:56 crc kubenswrapper[4865]: I1205 06:06:56.707979 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-vhnps" Dec 05 06:06:56 crc kubenswrapper[4865]: I1205 06:06:56.709679 4865 generic.go:334] "Generic (PLEG): container finished" podID="0f46c5b9-45e6-4002-a5e8-e07ecf828a80" containerID="002848338c24bee74cf24d2b5e579eee05426c38454cc7d522e239a70cfcecd3" exitCode=0 Dec 05 06:06:56 crc kubenswrapper[4865]: I1205 06:06:56.709722 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jjpk7" event={"ID":"0f46c5b9-45e6-4002-a5e8-e07ecf828a80","Type":"ContainerDied","Data":"002848338c24bee74cf24d2b5e579eee05426c38454cc7d522e239a70cfcecd3"} Dec 05 06:06:56 crc kubenswrapper[4865]: I1205 06:06:56.730182 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-vhnps" podStartSLOduration=2.141557638 podStartE2EDuration="10.73016168s" podCreationTimestamp="2025-12-05 06:06:46 +0000 UTC" firstStartedPulling="2025-12-05 06:06:47.33640805 +0000 UTC m=+826.616419272" lastFinishedPulling="2025-12-05 06:06:55.925012092 +0000 UTC m=+835.205023314" observedRunningTime="2025-12-05 06:06:56.728783111 +0000 UTC m=+836.008794343" watchObservedRunningTime="2025-12-05 06:06:56.73016168 +0000 UTC m=+836.010172912" Dec 05 06:06:57 crc kubenswrapper[4865]: I1205 06:06:57.721868 4865 generic.go:334] "Generic (PLEG): container finished" podID="0f46c5b9-45e6-4002-a5e8-e07ecf828a80" containerID="722f2c54fee34542e240b458d30b2e4d6f6c30d210b464214f2ffa1120126428" exitCode=0 Dec 05 06:06:57 crc kubenswrapper[4865]: I1205 06:06:57.722990 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-jjpk7" event={"ID":"0f46c5b9-45e6-4002-a5e8-e07ecf828a80","Type":"ContainerDied","Data":"722f2c54fee34542e240b458d30b2e4d6f6c30d210b464214f2ffa1120126428"} Dec 05 06:06:58 crc kubenswrapper[4865]: I1205 06:06:58.682181 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-jmvs2" Dec 05 06:06:58 crc kubenswrapper[4865]: I1205 06:06:58.731558 4865 generic.go:334] "Generic (PLEG): container finished" podID="0f46c5b9-45e6-4002-a5e8-e07ecf828a80" containerID="595a2bfd305f7708834d80c2d562d83f75c07301a6e5655477d1d6d1e4018e06" exitCode=0 Dec 05 06:06:58 crc kubenswrapper[4865]: I1205 06:06:58.731628 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jjpk7" event={"ID":"0f46c5b9-45e6-4002-a5e8-e07ecf828a80","Type":"ContainerDied","Data":"595a2bfd305f7708834d80c2d562d83f75c07301a6e5655477d1d6d1e4018e06"} Dec 05 06:06:59 crc kubenswrapper[4865]: I1205 06:06:59.791449 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jjpk7" event={"ID":"0f46c5b9-45e6-4002-a5e8-e07ecf828a80","Type":"ContainerStarted","Data":"bfd5506a71c54f743378dd8b7397caec85550d908af8843a89335f9748b4c6be"} Dec 05 06:06:59 crc kubenswrapper[4865]: I1205 06:06:59.791716 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jjpk7" event={"ID":"0f46c5b9-45e6-4002-a5e8-e07ecf828a80","Type":"ContainerStarted","Data":"6f973660e58d8233aa8d9a90e24970267fa926db65a1db87170890f7d7e97f86"} Dec 05 06:06:59 crc kubenswrapper[4865]: I1205 06:06:59.791726 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jjpk7" event={"ID":"0f46c5b9-45e6-4002-a5e8-e07ecf828a80","Type":"ContainerStarted","Data":"52f20e13c2d7bba444d55a82b187c1df1489d54853b0fcee6c7636ba695301f9"} Dec 05 06:06:59 crc kubenswrapper[4865]: I1205 06:06:59.791735 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jjpk7" event={"ID":"0f46c5b9-45e6-4002-a5e8-e07ecf828a80","Type":"ContainerStarted","Data":"b1977c20aed18c7c7ddec937105801abc3d2ba9065522e22d535826f6107cc6e"} Dec 05 06:06:59 crc kubenswrapper[4865]: I1205 06:06:59.791754 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jjpk7" event={"ID":"0f46c5b9-45e6-4002-a5e8-e07ecf828a80","Type":"ContainerStarted","Data":"470b0850aec4b60392d3e11f7203a7bd5598f765a2702aaaa1d85fdfc7dc7a53"} Dec 05 06:07:00 crc kubenswrapper[4865]: I1205 06:07:00.803539 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jjpk7" event={"ID":"0f46c5b9-45e6-4002-a5e8-e07ecf828a80","Type":"ContainerStarted","Data":"16382f39689e20a1b824a82ce6a6fee4ee181cdfec92eb70fcfef76c41d6c373"} Dec 05 06:07:00 crc kubenswrapper[4865]: I1205 06:07:00.803863 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:07:00 crc kubenswrapper[4865]: I1205 06:07:00.836455 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-jjpk7" podStartSLOduration=6.174707301 podStartE2EDuration="14.83643011s" podCreationTimestamp="2025-12-05 06:06:46 +0000 UTC" firstStartedPulling="2025-12-05 06:06:47.25648445 +0000 UTC m=+826.536495672" lastFinishedPulling="2025-12-05 06:06:55.918207259 +0000 UTC m=+835.198218481" observedRunningTime="2025-12-05 06:07:00.83608678 +0000 UTC m=+840.116098042" watchObservedRunningTime="2025-12-05 06:07:00.83643011 +0000 UTC m=+840.116441332" Dec 05 06:07:01 crc 
kubenswrapper[4865]: I1205 06:07:01.708292 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-cpbdt"] Dec 05 06:07:01 crc kubenswrapper[4865]: I1205 06:07:01.709468 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-cpbdt" Dec 05 06:07:01 crc kubenswrapper[4865]: I1205 06:07:01.713910 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-bgqws" Dec 05 06:07:01 crc kubenswrapper[4865]: I1205 06:07:01.714960 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 05 06:07:01 crc kubenswrapper[4865]: I1205 06:07:01.715551 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 05 06:07:01 crc kubenswrapper[4865]: I1205 06:07:01.726027 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cpbdt"] Dec 05 06:07:01 crc kubenswrapper[4865]: I1205 06:07:01.909913 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvkhf\" (UniqueName: \"kubernetes.io/projected/9f209fa8-1faa-484b-8aa1-c3a49922a70a-kube-api-access-mvkhf\") pod \"openstack-operator-index-cpbdt\" (UID: \"9f209fa8-1faa-484b-8aa1-c3a49922a70a\") " pod="openstack-operators/openstack-operator-index-cpbdt" Dec 05 06:07:02 crc kubenswrapper[4865]: I1205 06:07:02.010683 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvkhf\" (UniqueName: \"kubernetes.io/projected/9f209fa8-1faa-484b-8aa1-c3a49922a70a-kube-api-access-mvkhf\") pod \"openstack-operator-index-cpbdt\" (UID: \"9f209fa8-1faa-484b-8aa1-c3a49922a70a\") " pod="openstack-operators/openstack-operator-index-cpbdt" Dec 05 06:07:02 crc kubenswrapper[4865]: I1205 06:07:02.038053 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvkhf\" (UniqueName: \"kubernetes.io/projected/9f209fa8-1faa-484b-8aa1-c3a49922a70a-kube-api-access-mvkhf\") pod \"openstack-operator-index-cpbdt\" (UID: \"9f209fa8-1faa-484b-8aa1-c3a49922a70a\") " pod="openstack-operators/openstack-operator-index-cpbdt" Dec 05 06:07:02 crc kubenswrapper[4865]: I1205 06:07:02.064666 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:07:02 crc kubenswrapper[4865]: I1205 06:07:02.127810 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:07:02 crc kubenswrapper[4865]: I1205 06:07:02.337461 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-cpbdt" Dec 05 06:07:02 crc kubenswrapper[4865]: I1205 06:07:02.676580 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cpbdt"] Dec 05 06:07:02 crc kubenswrapper[4865]: I1205 06:07:02.817441 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cpbdt" event={"ID":"9f209fa8-1faa-484b-8aa1-c3a49922a70a","Type":"ContainerStarted","Data":"006b87149e559b97af5b5f6d76f44d1ed39a6cf768211f884c70dcd5cc3d8966"} Dec 05 06:07:05 crc kubenswrapper[4865]: I1205 06:07:05.074119 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-cpbdt"] Dec 05 06:07:05 crc kubenswrapper[4865]: I1205 06:07:05.695568 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-cgngs"] Dec 05 06:07:05 crc kubenswrapper[4865]: I1205 06:07:05.697061 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-cgngs" Dec 05 06:07:05 crc kubenswrapper[4865]: I1205 06:07:05.708425 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cgngs"] Dec 05 06:07:05 crc kubenswrapper[4865]: I1205 06:07:05.784796 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxbrp\" (UniqueName: \"kubernetes.io/projected/5f11965e-838f-4054-ad28-f25e9ba54596-kube-api-access-zxbrp\") pod \"openstack-operator-index-cgngs\" (UID: \"5f11965e-838f-4054-ad28-f25e9ba54596\") " pod="openstack-operators/openstack-operator-index-cgngs" Dec 05 06:07:05 crc kubenswrapper[4865]: I1205 06:07:05.886300 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxbrp\" (UniqueName: \"kubernetes.io/projected/5f11965e-838f-4054-ad28-f25e9ba54596-kube-api-access-zxbrp\") pod \"openstack-operator-index-cgngs\" (UID: \"5f11965e-838f-4054-ad28-f25e9ba54596\") " pod="openstack-operators/openstack-operator-index-cgngs" Dec 05 06:07:05 crc kubenswrapper[4865]: I1205 06:07:05.908555 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxbrp\" (UniqueName: \"kubernetes.io/projected/5f11965e-838f-4054-ad28-f25e9ba54596-kube-api-access-zxbrp\") pod \"openstack-operator-index-cgngs\" (UID: \"5f11965e-838f-4054-ad28-f25e9ba54596\") " pod="openstack-operators/openstack-operator-index-cgngs" Dec 05 06:07:06 crc kubenswrapper[4865]: I1205 06:07:06.021844 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-cgngs" Dec 05 06:07:06 crc kubenswrapper[4865]: I1205 06:07:06.853587 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cpbdt" event={"ID":"9f209fa8-1faa-484b-8aa1-c3a49922a70a","Type":"ContainerStarted","Data":"8e71eb5c265368b83a9b838d11d52b245a820ec664f30c023cdafe87b5eaa9a1"} Dec 05 06:07:06 crc kubenswrapper[4865]: I1205 06:07:06.853695 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-cpbdt" podUID="9f209fa8-1faa-484b-8aa1-c3a49922a70a" containerName="registry-server" containerID="cri-o://8e71eb5c265368b83a9b838d11d52b245a820ec664f30c023cdafe87b5eaa9a1" gracePeriod=2 Dec 05 06:07:06 crc kubenswrapper[4865]: I1205 06:07:06.879784 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-cpbdt" podStartSLOduration=2.054845752 podStartE2EDuration="5.879755694s" podCreationTimestamp="2025-12-05 06:07:01 +0000 UTC" firstStartedPulling="2025-12-05 06:07:02.693052401 +0000 UTC m=+841.973063623" lastFinishedPulling="2025-12-05 06:07:06.517962343 +0000 UTC m=+845.797973565" observedRunningTime="2025-12-05 06:07:06.871581293 +0000 UTC m=+846.151592525" watchObservedRunningTime="2025-12-05 06:07:06.879755694 +0000 UTC m=+846.159766916" Dec 05 06:07:06 crc kubenswrapper[4865]: I1205 06:07:06.937165 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cgngs"] Dec 05 06:07:06 crc kubenswrapper[4865]: W1205 06:07:06.966135 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f11965e_838f_4054_ad28_f25e9ba54596.slice/crio-33aefc863432cf8075c06ee217277b6091d01a57995b1cd48e574b529fb7297f WatchSource:0}: Error finding container 33aefc863432cf8075c06ee217277b6091d01a57995b1cd48e574b529fb7297f: Status 404 returned error can't find the container with id 33aefc863432cf8075c06ee217277b6091d01a57995b1cd48e574b529fb7297f Dec 05 06:07:07 crc kubenswrapper[4865]: I1205 06:07:07.058216 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-vhnps" Dec 05 06:07:07 crc kubenswrapper[4865]: I1205 06:07:07.245684 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-cpbdt" Dec 05 06:07:07 crc kubenswrapper[4865]: I1205 06:07:07.408441 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvkhf\" (UniqueName: \"kubernetes.io/projected/9f209fa8-1faa-484b-8aa1-c3a49922a70a-kube-api-access-mvkhf\") pod \"9f209fa8-1faa-484b-8aa1-c3a49922a70a\" (UID: \"9f209fa8-1faa-484b-8aa1-c3a49922a70a\") " Dec 05 06:07:07 crc kubenswrapper[4865]: I1205 06:07:07.415413 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f209fa8-1faa-484b-8aa1-c3a49922a70a-kube-api-access-mvkhf" (OuterVolumeSpecName: "kube-api-access-mvkhf") pod "9f209fa8-1faa-484b-8aa1-c3a49922a70a" (UID: "9f209fa8-1faa-484b-8aa1-c3a49922a70a"). InnerVolumeSpecName "kube-api-access-mvkhf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:07:07 crc kubenswrapper[4865]: I1205 06:07:07.510163 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvkhf\" (UniqueName: \"kubernetes.io/projected/9f209fa8-1faa-484b-8aa1-c3a49922a70a-kube-api-access-mvkhf\") on node \"crc\" DevicePath \"\"" Dec 05 06:07:07 crc kubenswrapper[4865]: I1205 06:07:07.797181 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-m74rz" Dec 05 06:07:07 crc kubenswrapper[4865]: I1205 06:07:07.861521 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cgngs" event={"ID":"5f11965e-838f-4054-ad28-f25e9ba54596","Type":"ContainerStarted","Data":"502f601ebc8ab8e13f5ea7b3e5418db345f1e0b15dd90e9e718f5c40907596d7"} Dec 05 06:07:07 crc kubenswrapper[4865]: I1205 06:07:07.861572 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cgngs" event={"ID":"5f11965e-838f-4054-ad28-f25e9ba54596","Type":"ContainerStarted","Data":"33aefc863432cf8075c06ee217277b6091d01a57995b1cd48e574b529fb7297f"} Dec 05 06:07:07 crc kubenswrapper[4865]: I1205 06:07:07.863396 4865 generic.go:334] "Generic (PLEG): container finished" podID="9f209fa8-1faa-484b-8aa1-c3a49922a70a" containerID="8e71eb5c265368b83a9b838d11d52b245a820ec664f30c023cdafe87b5eaa9a1" exitCode=0 Dec 05 06:07:07 crc kubenswrapper[4865]: I1205 06:07:07.863440 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cpbdt" event={"ID":"9f209fa8-1faa-484b-8aa1-c3a49922a70a","Type":"ContainerDied","Data":"8e71eb5c265368b83a9b838d11d52b245a820ec664f30c023cdafe87b5eaa9a1"} Dec 05 06:07:07 crc kubenswrapper[4865]: I1205 06:07:07.863462 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cpbdt" event={"ID":"9f209fa8-1faa-484b-8aa1-c3a49922a70a","Type":"ContainerDied","Data":"006b87149e559b97af5b5f6d76f44d1ed39a6cf768211f884c70dcd5cc3d8966"} Dec 05 06:07:07 crc kubenswrapper[4865]: I1205 06:07:07.863478 4865 scope.go:117] "RemoveContainer" containerID="8e71eb5c265368b83a9b838d11d52b245a820ec664f30c023cdafe87b5eaa9a1" Dec 05 06:07:07 crc kubenswrapper[4865]: I1205 06:07:07.863496 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-cpbdt" Dec 05 06:07:07 crc kubenswrapper[4865]: I1205 06:07:07.888345 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-cgngs" podStartSLOduration=2.840070351 podStartE2EDuration="2.888328183s" podCreationTimestamp="2025-12-05 06:07:05 +0000 UTC" firstStartedPulling="2025-12-05 06:07:06.970774387 +0000 UTC m=+846.250785619" lastFinishedPulling="2025-12-05 06:07:07.019032209 +0000 UTC m=+846.299043451" observedRunningTime="2025-12-05 06:07:07.884428465 +0000 UTC m=+847.164439687" watchObservedRunningTime="2025-12-05 06:07:07.888328183 +0000 UTC m=+847.168339405" Dec 05 06:07:07 crc kubenswrapper[4865]: I1205 06:07:07.895062 4865 scope.go:117] "RemoveContainer" containerID="8e71eb5c265368b83a9b838d11d52b245a820ec664f30c023cdafe87b5eaa9a1" Dec 05 06:07:07 crc kubenswrapper[4865]: E1205 06:07:07.895628 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e71eb5c265368b83a9b838d11d52b245a820ec664f30c023cdafe87b5eaa9a1\": container with ID starting with 8e71eb5c265368b83a9b838d11d52b245a820ec664f30c023cdafe87b5eaa9a1 not found: ID does not exist" containerID="8e71eb5c265368b83a9b838d11d52b245a820ec664f30c023cdafe87b5eaa9a1" Dec 05 06:07:07 crc kubenswrapper[4865]: I1205 06:07:07.895662 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e71eb5c265368b83a9b838d11d52b245a820ec664f30c023cdafe87b5eaa9a1"} err="failed to get container status \"8e71eb5c265368b83a9b838d11d52b245a820ec664f30c023cdafe87b5eaa9a1\": rpc error: code = NotFound desc = could not find container \"8e71eb5c265368b83a9b838d11d52b245a820ec664f30c023cdafe87b5eaa9a1\": container with ID starting with 8e71eb5c265368b83a9b838d11d52b245a820ec664f30c023cdafe87b5eaa9a1 not found: ID does not exist" Dec 05 06:07:07 crc kubenswrapper[4865]: I1205 06:07:07.908527 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-cpbdt"] Dec 05 06:07:07 crc kubenswrapper[4865]: I1205 06:07:07.913415 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-cpbdt"] Dec 05 06:07:09 crc kubenswrapper[4865]: I1205 06:07:09.017348 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f209fa8-1faa-484b-8aa1-c3a49922a70a" path="/var/lib/kubelet/pods/9f209fa8-1faa-484b-8aa1-c3a49922a70a/volumes" Dec 05 06:07:11 crc kubenswrapper[4865]: I1205 06:07:11.098123 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lg9fz"] Dec 05 06:07:11 crc kubenswrapper[4865]: E1205 06:07:11.098516 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f209fa8-1faa-484b-8aa1-c3a49922a70a" containerName="registry-server" Dec 05 06:07:11 crc kubenswrapper[4865]: I1205 06:07:11.098537 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f209fa8-1faa-484b-8aa1-c3a49922a70a" containerName="registry-server" Dec 05 06:07:11 crc kubenswrapper[4865]: I1205 06:07:11.098776 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f209fa8-1faa-484b-8aa1-c3a49922a70a" containerName="registry-server" Dec 05 06:07:11 crc kubenswrapper[4865]: I1205 06:07:11.100508 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lg9fz" Dec 05 06:07:11 crc kubenswrapper[4865]: I1205 06:07:11.122415 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lg9fz"] Dec 05 06:07:11 crc kubenswrapper[4865]: I1205 06:07:11.169099 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfw5d\" (UniqueName: \"kubernetes.io/projected/b838e97c-05f0-4a3f-ab2d-4affdc12fa0e-kube-api-access-tfw5d\") pod \"certified-operators-lg9fz\" (UID: \"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e\") " pod="openshift-marketplace/certified-operators-lg9fz" Dec 05 06:07:11 crc kubenswrapper[4865]: I1205 06:07:11.169470 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b838e97c-05f0-4a3f-ab2d-4affdc12fa0e-utilities\") pod \"certified-operators-lg9fz\" (UID: \"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e\") " pod="openshift-marketplace/certified-operators-lg9fz" Dec 05 06:07:11 crc kubenswrapper[4865]: I1205 06:07:11.169519 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b838e97c-05f0-4a3f-ab2d-4affdc12fa0e-catalog-content\") pod \"certified-operators-lg9fz\" (UID: \"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e\") " pod="openshift-marketplace/certified-operators-lg9fz" Dec 05 06:07:11 crc kubenswrapper[4865]: I1205 06:07:11.270943 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b838e97c-05f0-4a3f-ab2d-4affdc12fa0e-catalog-content\") pod \"certified-operators-lg9fz\" (UID: \"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e\") " pod="openshift-marketplace/certified-operators-lg9fz" Dec 05 06:07:11 crc kubenswrapper[4865]: I1205 06:07:11.271068 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfw5d\" (UniqueName: \"kubernetes.io/projected/b838e97c-05f0-4a3f-ab2d-4affdc12fa0e-kube-api-access-tfw5d\") pod \"certified-operators-lg9fz\" (UID: \"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e\") " pod="openshift-marketplace/certified-operators-lg9fz" Dec 05 06:07:11 crc kubenswrapper[4865]: I1205 06:07:11.271120 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b838e97c-05f0-4a3f-ab2d-4affdc12fa0e-utilities\") pod \"certified-operators-lg9fz\" (UID: \"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e\") " pod="openshift-marketplace/certified-operators-lg9fz" Dec 05 06:07:11 crc kubenswrapper[4865]: I1205 06:07:11.271620 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b838e97c-05f0-4a3f-ab2d-4affdc12fa0e-utilities\") pod \"certified-operators-lg9fz\" (UID: \"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e\") " pod="openshift-marketplace/certified-operators-lg9fz" Dec 05 06:07:11 crc kubenswrapper[4865]: I1205 06:07:11.271769 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b838e97c-05f0-4a3f-ab2d-4affdc12fa0e-catalog-content\") pod \"certified-operators-lg9fz\" (UID: \"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e\") " pod="openshift-marketplace/certified-operators-lg9fz" Dec 05 06:07:11 crc kubenswrapper[4865]: I1205 06:07:11.296266 4865 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tfw5d\" (UniqueName: \"kubernetes.io/projected/b838e97c-05f0-4a3f-ab2d-4affdc12fa0e-kube-api-access-tfw5d\") pod \"certified-operators-lg9fz\" (UID: \"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e\") " pod="openshift-marketplace/certified-operators-lg9fz" Dec 05 06:07:11 crc kubenswrapper[4865]: I1205 06:07:11.438397 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lg9fz" Dec 05 06:07:11 crc kubenswrapper[4865]: W1205 06:07:11.753432 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb838e97c_05f0_4a3f_ab2d_4affdc12fa0e.slice/crio-117f3f7e21414345b9caed9c334cef9d8c9054dae9621c875179b6c6f31d8ddc WatchSource:0}: Error finding container 117f3f7e21414345b9caed9c334cef9d8c9054dae9621c875179b6c6f31d8ddc: Status 404 returned error can't find the container with id 117f3f7e21414345b9caed9c334cef9d8c9054dae9621c875179b6c6f31d8ddc Dec 05 06:07:11 crc kubenswrapper[4865]: I1205 06:07:11.763601 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lg9fz"] Dec 05 06:07:11 crc kubenswrapper[4865]: I1205 06:07:11.898775 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lg9fz" event={"ID":"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e","Type":"ContainerStarted","Data":"117f3f7e21414345b9caed9c334cef9d8c9054dae9621c875179b6c6f31d8ddc"} Dec 05 06:07:12 crc kubenswrapper[4865]: I1205 06:07:12.909059 4865 generic.go:334] "Generic (PLEG): container finished" podID="b838e97c-05f0-4a3f-ab2d-4affdc12fa0e" containerID="eb1833bad24f9b8bdd7e446b4eea186d2bfa7c9a25d3847d7eea620849789fde" exitCode=0 Dec 05 06:07:12 crc kubenswrapper[4865]: I1205 06:07:12.909101 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lg9fz" event={"ID":"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e","Type":"ContainerDied","Data":"eb1833bad24f9b8bdd7e446b4eea186d2bfa7c9a25d3847d7eea620849789fde"} Dec 05 06:07:13 crc kubenswrapper[4865]: I1205 06:07:13.919545 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lg9fz" event={"ID":"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e","Type":"ContainerStarted","Data":"ffa3bfefcd196a72f0d47b6b8c6a57e141f479f7e4c9466cbf33432a5f5b3069"} Dec 05 06:07:14 crc kubenswrapper[4865]: I1205 06:07:14.926066 4865 generic.go:334] "Generic (PLEG): container finished" podID="b838e97c-05f0-4a3f-ab2d-4affdc12fa0e" containerID="ffa3bfefcd196a72f0d47b6b8c6a57e141f479f7e4c9466cbf33432a5f5b3069" exitCode=0 Dec 05 06:07:14 crc kubenswrapper[4865]: I1205 06:07:14.926166 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lg9fz" event={"ID":"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e","Type":"ContainerDied","Data":"ffa3bfefcd196a72f0d47b6b8c6a57e141f479f7e4c9466cbf33432a5f5b3069"} Dec 05 06:07:15 crc kubenswrapper[4865]: I1205 06:07:15.935070 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lg9fz" event={"ID":"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e","Type":"ContainerStarted","Data":"830f1fb30b1e50fd0e83fcc22f24b205366d820d0df726934d8d8e6ee770db2d"} Dec 05 06:07:15 crc kubenswrapper[4865]: I1205 06:07:15.960568 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lg9fz" 
podStartSLOduration=2.238639775 podStartE2EDuration="4.960544042s" podCreationTimestamp="2025-12-05 06:07:11 +0000 UTC" firstStartedPulling="2025-12-05 06:07:12.911214428 +0000 UTC m=+852.191225650" lastFinishedPulling="2025-12-05 06:07:15.633118695 +0000 UTC m=+854.913129917" observedRunningTime="2025-12-05 06:07:15.955980586 +0000 UTC m=+855.235991798" watchObservedRunningTime="2025-12-05 06:07:15.960544042 +0000 UTC m=+855.240555274" Dec 05 06:07:16 crc kubenswrapper[4865]: I1205 06:07:16.022552 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-cgngs" Dec 05 06:07:16 crc kubenswrapper[4865]: I1205 06:07:16.023031 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-cgngs" Dec 05 06:07:16 crc kubenswrapper[4865]: I1205 06:07:16.060582 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-cgngs" Dec 05 06:07:16 crc kubenswrapper[4865]: I1205 06:07:16.973883 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-cgngs" Dec 05 06:07:17 crc kubenswrapper[4865]: I1205 06:07:17.069217 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-jjpk7" Dec 05 06:07:17 crc kubenswrapper[4865]: I1205 06:07:17.922312 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9"] Dec 05 06:07:17 crc kubenswrapper[4865]: I1205 06:07:17.924747 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9" Dec 05 06:07:17 crc kubenswrapper[4865]: I1205 06:07:17.928547 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-92689" Dec 05 06:07:17 crc kubenswrapper[4865]: I1205 06:07:17.941015 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9"] Dec 05 06:07:17 crc kubenswrapper[4865]: I1205 06:07:17.977532 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/afe32c73-a754-43f9-bc45-1ec0219469d9-bundle\") pod \"7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9\" (UID: \"afe32c73-a754-43f9-bc45-1ec0219469d9\") " pod="openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9" Dec 05 06:07:17 crc kubenswrapper[4865]: I1205 06:07:17.978537 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hk4b\" (UniqueName: \"kubernetes.io/projected/afe32c73-a754-43f9-bc45-1ec0219469d9-kube-api-access-9hk4b\") pod \"7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9\" (UID: \"afe32c73-a754-43f9-bc45-1ec0219469d9\") " pod="openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9" Dec 05 06:07:17 crc kubenswrapper[4865]: I1205 06:07:17.978664 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/afe32c73-a754-43f9-bc45-1ec0219469d9-util\") pod \"7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9\" (UID: 
\"afe32c73-a754-43f9-bc45-1ec0219469d9\") " pod="openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9" Dec 05 06:07:18 crc kubenswrapper[4865]: I1205 06:07:18.080625 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/afe32c73-a754-43f9-bc45-1ec0219469d9-bundle\") pod \"7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9\" (UID: \"afe32c73-a754-43f9-bc45-1ec0219469d9\") " pod="openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9" Dec 05 06:07:18 crc kubenswrapper[4865]: I1205 06:07:18.080682 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hk4b\" (UniqueName: \"kubernetes.io/projected/afe32c73-a754-43f9-bc45-1ec0219469d9-kube-api-access-9hk4b\") pod \"7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9\" (UID: \"afe32c73-a754-43f9-bc45-1ec0219469d9\") " pod="openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9" Dec 05 06:07:18 crc kubenswrapper[4865]: I1205 06:07:18.080714 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/afe32c73-a754-43f9-bc45-1ec0219469d9-util\") pod \"7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9\" (UID: \"afe32c73-a754-43f9-bc45-1ec0219469d9\") " pod="openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9" Dec 05 06:07:18 crc kubenswrapper[4865]: I1205 06:07:18.082527 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/afe32c73-a754-43f9-bc45-1ec0219469d9-util\") pod \"7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9\" (UID: \"afe32c73-a754-43f9-bc45-1ec0219469d9\") " pod="openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9" Dec 05 06:07:18 crc kubenswrapper[4865]: I1205 06:07:18.082619 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/afe32c73-a754-43f9-bc45-1ec0219469d9-bundle\") pod \"7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9\" (UID: \"afe32c73-a754-43f9-bc45-1ec0219469d9\") " pod="openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9" Dec 05 06:07:18 crc kubenswrapper[4865]: I1205 06:07:18.105604 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hk4b\" (UniqueName: \"kubernetes.io/projected/afe32c73-a754-43f9-bc45-1ec0219469d9-kube-api-access-9hk4b\") pod \"7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9\" (UID: \"afe32c73-a754-43f9-bc45-1ec0219469d9\") " pod="openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9" Dec 05 06:07:18 crc kubenswrapper[4865]: I1205 06:07:18.252068 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9" Dec 05 06:07:18 crc kubenswrapper[4865]: I1205 06:07:18.531994 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9"] Dec 05 06:07:18 crc kubenswrapper[4865]: I1205 06:07:18.961672 4865 generic.go:334] "Generic (PLEG): container finished" podID="afe32c73-a754-43f9-bc45-1ec0219469d9" containerID="aba4df1681973f3b8d84bee7444713a91cf20aa57085d0c331c48850641479e7" exitCode=0 Dec 05 06:07:18 crc kubenswrapper[4865]: I1205 06:07:18.961774 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9" event={"ID":"afe32c73-a754-43f9-bc45-1ec0219469d9","Type":"ContainerDied","Data":"aba4df1681973f3b8d84bee7444713a91cf20aa57085d0c331c48850641479e7"} Dec 05 06:07:18 crc kubenswrapper[4865]: I1205 06:07:18.962077 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9" event={"ID":"afe32c73-a754-43f9-bc45-1ec0219469d9","Type":"ContainerStarted","Data":"9a6fceddd283e78530c82845e5d5826703fbd0743eae952ed8abe67977207582"} Dec 05 06:07:20 crc kubenswrapper[4865]: I1205 06:07:20.980286 4865 generic.go:334] "Generic (PLEG): container finished" podID="afe32c73-a754-43f9-bc45-1ec0219469d9" containerID="3349ca563a8dcb372ee55bab83dc4f918ceeac977911dad9520690ded26239ca" exitCode=0 Dec 05 06:07:20 crc kubenswrapper[4865]: I1205 06:07:20.980458 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9" event={"ID":"afe32c73-a754-43f9-bc45-1ec0219469d9","Type":"ContainerDied","Data":"3349ca563a8dcb372ee55bab83dc4f918ceeac977911dad9520690ded26239ca"} Dec 05 06:07:21 crc kubenswrapper[4865]: I1205 06:07:21.439492 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lg9fz" Dec 05 06:07:21 crc kubenswrapper[4865]: I1205 06:07:21.440568 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lg9fz" Dec 05 06:07:21 crc kubenswrapper[4865]: I1205 06:07:21.515058 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lg9fz" Dec 05 06:07:21 crc kubenswrapper[4865]: I1205 06:07:21.989895 4865 generic.go:334] "Generic (PLEG): container finished" podID="afe32c73-a754-43f9-bc45-1ec0219469d9" containerID="f404b9b772395ba401b8c874049363c52d39283f647509e5ef9787da05249a7a" exitCode=0 Dec 05 06:07:21 crc kubenswrapper[4865]: I1205 06:07:21.989971 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9" event={"ID":"afe32c73-a754-43f9-bc45-1ec0219469d9","Type":"ContainerDied","Data":"f404b9b772395ba401b8c874049363c52d39283f647509e5ef9787da05249a7a"} Dec 05 06:07:22 crc kubenswrapper[4865]: I1205 06:07:22.045663 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lg9fz" Dec 05 06:07:23 crc kubenswrapper[4865]: I1205 06:07:23.330043 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9" Dec 05 06:07:23 crc kubenswrapper[4865]: I1205 06:07:23.402439 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/afe32c73-a754-43f9-bc45-1ec0219469d9-bundle\") pod \"afe32c73-a754-43f9-bc45-1ec0219469d9\" (UID: \"afe32c73-a754-43f9-bc45-1ec0219469d9\") " Dec 05 06:07:23 crc kubenswrapper[4865]: I1205 06:07:23.402601 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/afe32c73-a754-43f9-bc45-1ec0219469d9-util\") pod \"afe32c73-a754-43f9-bc45-1ec0219469d9\" (UID: \"afe32c73-a754-43f9-bc45-1ec0219469d9\") " Dec 05 06:07:23 crc kubenswrapper[4865]: I1205 06:07:23.402703 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hk4b\" (UniqueName: \"kubernetes.io/projected/afe32c73-a754-43f9-bc45-1ec0219469d9-kube-api-access-9hk4b\") pod \"afe32c73-a754-43f9-bc45-1ec0219469d9\" (UID: \"afe32c73-a754-43f9-bc45-1ec0219469d9\") " Dec 05 06:07:23 crc kubenswrapper[4865]: I1205 06:07:23.403311 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afe32c73-a754-43f9-bc45-1ec0219469d9-bundle" (OuterVolumeSpecName: "bundle") pod "afe32c73-a754-43f9-bc45-1ec0219469d9" (UID: "afe32c73-a754-43f9-bc45-1ec0219469d9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:07:23 crc kubenswrapper[4865]: I1205 06:07:23.413811 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afe32c73-a754-43f9-bc45-1ec0219469d9-kube-api-access-9hk4b" (OuterVolumeSpecName: "kube-api-access-9hk4b") pod "afe32c73-a754-43f9-bc45-1ec0219469d9" (UID: "afe32c73-a754-43f9-bc45-1ec0219469d9"). InnerVolumeSpecName "kube-api-access-9hk4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:07:23 crc kubenswrapper[4865]: I1205 06:07:23.436057 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afe32c73-a754-43f9-bc45-1ec0219469d9-util" (OuterVolumeSpecName: "util") pod "afe32c73-a754-43f9-bc45-1ec0219469d9" (UID: "afe32c73-a754-43f9-bc45-1ec0219469d9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:07:23 crc kubenswrapper[4865]: I1205 06:07:23.503814 4865 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/afe32c73-a754-43f9-bc45-1ec0219469d9-util\") on node \"crc\" DevicePath \"\"" Dec 05 06:07:23 crc kubenswrapper[4865]: I1205 06:07:23.503885 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hk4b\" (UniqueName: \"kubernetes.io/projected/afe32c73-a754-43f9-bc45-1ec0219469d9-kube-api-access-9hk4b\") on node \"crc\" DevicePath \"\"" Dec 05 06:07:23 crc kubenswrapper[4865]: I1205 06:07:23.503900 4865 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/afe32c73-a754-43f9-bc45-1ec0219469d9-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:07:24 crc kubenswrapper[4865]: I1205 06:07:24.009768 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9" event={"ID":"afe32c73-a754-43f9-bc45-1ec0219469d9","Type":"ContainerDied","Data":"9a6fceddd283e78530c82845e5d5826703fbd0743eae952ed8abe67977207582"} Dec 05 06:07:24 crc kubenswrapper[4865]: I1205 06:07:24.009828 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a6fceddd283e78530c82845e5d5826703fbd0743eae952ed8abe67977207582" Dec 05 06:07:24 crc kubenswrapper[4865]: I1205 06:07:24.010132 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9" Dec 05 06:07:24 crc kubenswrapper[4865]: I1205 06:07:24.880568 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lg9fz"] Dec 05 06:07:24 crc kubenswrapper[4865]: I1205 06:07:24.881986 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lg9fz" podUID="b838e97c-05f0-4a3f-ab2d-4affdc12fa0e" containerName="registry-server" containerID="cri-o://830f1fb30b1e50fd0e83fcc22f24b205366d820d0df726934d8d8e6ee770db2d" gracePeriod=2 Dec 05 06:07:25 crc kubenswrapper[4865]: I1205 06:07:25.842663 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lg9fz" Dec 05 06:07:25 crc kubenswrapper[4865]: I1205 06:07:25.942432 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b838e97c-05f0-4a3f-ab2d-4affdc12fa0e-utilities\") pod \"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e\" (UID: \"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e\") " Dec 05 06:07:25 crc kubenswrapper[4865]: I1205 06:07:25.942534 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfw5d\" (UniqueName: \"kubernetes.io/projected/b838e97c-05f0-4a3f-ab2d-4affdc12fa0e-kube-api-access-tfw5d\") pod \"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e\" (UID: \"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e\") " Dec 05 06:07:25 crc kubenswrapper[4865]: I1205 06:07:25.942648 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b838e97c-05f0-4a3f-ab2d-4affdc12fa0e-catalog-content\") pod \"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e\" (UID: \"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e\") " Dec 05 06:07:25 crc kubenswrapper[4865]: I1205 06:07:25.943222 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b838e97c-05f0-4a3f-ab2d-4affdc12fa0e-utilities" (OuterVolumeSpecName: "utilities") pod "b838e97c-05f0-4a3f-ab2d-4affdc12fa0e" (UID: "b838e97c-05f0-4a3f-ab2d-4affdc12fa0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:07:25 crc kubenswrapper[4865]: I1205 06:07:25.961746 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b838e97c-05f0-4a3f-ab2d-4affdc12fa0e-kube-api-access-tfw5d" (OuterVolumeSpecName: "kube-api-access-tfw5d") pod "b838e97c-05f0-4a3f-ab2d-4affdc12fa0e" (UID: "b838e97c-05f0-4a3f-ab2d-4affdc12fa0e"). InnerVolumeSpecName "kube-api-access-tfw5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:07:25 crc kubenswrapper[4865]: I1205 06:07:25.963600 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b838e97c-05f0-4a3f-ab2d-4affdc12fa0e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:07:25 crc kubenswrapper[4865]: I1205 06:07:25.963624 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfw5d\" (UniqueName: \"kubernetes.io/projected/b838e97c-05f0-4a3f-ab2d-4affdc12fa0e-kube-api-access-tfw5d\") on node \"crc\" DevicePath \"\"" Dec 05 06:07:25 crc kubenswrapper[4865]: I1205 06:07:25.988245 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b838e97c-05f0-4a3f-ab2d-4affdc12fa0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b838e97c-05f0-4a3f-ab2d-4affdc12fa0e" (UID: "b838e97c-05f0-4a3f-ab2d-4affdc12fa0e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:07:26 crc kubenswrapper[4865]: I1205 06:07:26.025278 4865 generic.go:334] "Generic (PLEG): container finished" podID="b838e97c-05f0-4a3f-ab2d-4affdc12fa0e" containerID="830f1fb30b1e50fd0e83fcc22f24b205366d820d0df726934d8d8e6ee770db2d" exitCode=0 Dec 05 06:07:26 crc kubenswrapper[4865]: I1205 06:07:26.025653 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lg9fz" event={"ID":"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e","Type":"ContainerDied","Data":"830f1fb30b1e50fd0e83fcc22f24b205366d820d0df726934d8d8e6ee770db2d"} Dec 05 06:07:26 crc kubenswrapper[4865]: I1205 06:07:26.025707 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lg9fz" event={"ID":"b838e97c-05f0-4a3f-ab2d-4affdc12fa0e","Type":"ContainerDied","Data":"117f3f7e21414345b9caed9c334cef9d8c9054dae9621c875179b6c6f31d8ddc"} Dec 05 06:07:26 crc kubenswrapper[4865]: I1205 06:07:26.025725 4865 scope.go:117] "RemoveContainer" containerID="830f1fb30b1e50fd0e83fcc22f24b205366d820d0df726934d8d8e6ee770db2d" Dec 05 06:07:26 crc kubenswrapper[4865]: I1205 06:07:26.025933 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lg9fz" Dec 05 06:07:26 crc kubenswrapper[4865]: I1205 06:07:26.045806 4865 scope.go:117] "RemoveContainer" containerID="ffa3bfefcd196a72f0d47b6b8c6a57e141f479f7e4c9466cbf33432a5f5b3069" Dec 05 06:07:26 crc kubenswrapper[4865]: I1205 06:07:26.064810 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lg9fz"] Dec 05 06:07:26 crc kubenswrapper[4865]: I1205 06:07:26.065240 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b838e97c-05f0-4a3f-ab2d-4affdc12fa0e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:07:26 crc kubenswrapper[4865]: I1205 06:07:26.071890 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lg9fz"] Dec 05 06:07:26 crc kubenswrapper[4865]: I1205 06:07:26.082514 4865 scope.go:117] "RemoveContainer" containerID="eb1833bad24f9b8bdd7e446b4eea186d2bfa7c9a25d3847d7eea620849789fde" Dec 05 06:07:26 crc kubenswrapper[4865]: I1205 06:07:26.104645 4865 scope.go:117] "RemoveContainer" containerID="830f1fb30b1e50fd0e83fcc22f24b205366d820d0df726934d8d8e6ee770db2d" Dec 05 06:07:26 crc kubenswrapper[4865]: E1205 06:07:26.105033 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"830f1fb30b1e50fd0e83fcc22f24b205366d820d0df726934d8d8e6ee770db2d\": container with ID starting with 830f1fb30b1e50fd0e83fcc22f24b205366d820d0df726934d8d8e6ee770db2d not found: ID does not exist" containerID="830f1fb30b1e50fd0e83fcc22f24b205366d820d0df726934d8d8e6ee770db2d" Dec 05 06:07:26 crc kubenswrapper[4865]: I1205 06:07:26.105075 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"830f1fb30b1e50fd0e83fcc22f24b205366d820d0df726934d8d8e6ee770db2d"} err="failed to get container status \"830f1fb30b1e50fd0e83fcc22f24b205366d820d0df726934d8d8e6ee770db2d\": rpc error: code = NotFound desc = could not find container \"830f1fb30b1e50fd0e83fcc22f24b205366d820d0df726934d8d8e6ee770db2d\": container with ID starting with 830f1fb30b1e50fd0e83fcc22f24b205366d820d0df726934d8d8e6ee770db2d not found: ID does not exist" Dec 05 
06:07:26 crc kubenswrapper[4865]: I1205 06:07:26.105102 4865 scope.go:117] "RemoveContainer" containerID="ffa3bfefcd196a72f0d47b6b8c6a57e141f479f7e4c9466cbf33432a5f5b3069" Dec 05 06:07:26 crc kubenswrapper[4865]: E1205 06:07:26.105648 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffa3bfefcd196a72f0d47b6b8c6a57e141f479f7e4c9466cbf33432a5f5b3069\": container with ID starting with ffa3bfefcd196a72f0d47b6b8c6a57e141f479f7e4c9466cbf33432a5f5b3069 not found: ID does not exist" containerID="ffa3bfefcd196a72f0d47b6b8c6a57e141f479f7e4c9466cbf33432a5f5b3069" Dec 05 06:07:26 crc kubenswrapper[4865]: I1205 06:07:26.105698 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffa3bfefcd196a72f0d47b6b8c6a57e141f479f7e4c9466cbf33432a5f5b3069"} err="failed to get container status \"ffa3bfefcd196a72f0d47b6b8c6a57e141f479f7e4c9466cbf33432a5f5b3069\": rpc error: code = NotFound desc = could not find container \"ffa3bfefcd196a72f0d47b6b8c6a57e141f479f7e4c9466cbf33432a5f5b3069\": container with ID starting with ffa3bfefcd196a72f0d47b6b8c6a57e141f479f7e4c9466cbf33432a5f5b3069 not found: ID does not exist" Dec 05 06:07:26 crc kubenswrapper[4865]: I1205 06:07:26.105722 4865 scope.go:117] "RemoveContainer" containerID="eb1833bad24f9b8bdd7e446b4eea186d2bfa7c9a25d3847d7eea620849789fde" Dec 05 06:07:26 crc kubenswrapper[4865]: E1205 06:07:26.106014 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb1833bad24f9b8bdd7e446b4eea186d2bfa7c9a25d3847d7eea620849789fde\": container with ID starting with eb1833bad24f9b8bdd7e446b4eea186d2bfa7c9a25d3847d7eea620849789fde not found: ID does not exist" containerID="eb1833bad24f9b8bdd7e446b4eea186d2bfa7c9a25d3847d7eea620849789fde" Dec 05 06:07:26 crc kubenswrapper[4865]: I1205 06:07:26.106074 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb1833bad24f9b8bdd7e446b4eea186d2bfa7c9a25d3847d7eea620849789fde"} err="failed to get container status \"eb1833bad24f9b8bdd7e446b4eea186d2bfa7c9a25d3847d7eea620849789fde\": rpc error: code = NotFound desc = could not find container \"eb1833bad24f9b8bdd7e446b4eea186d2bfa7c9a25d3847d7eea620849789fde\": container with ID starting with eb1833bad24f9b8bdd7e446b4eea186d2bfa7c9a25d3847d7eea620849789fde not found: ID does not exist" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.015336 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b838e97c-05f0-4a3f-ab2d-4affdc12fa0e" path="/var/lib/kubelet/pods/b838e97c-05f0-4a3f-ab2d-4affdc12fa0e/volumes" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.488556 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-48pxj"] Dec 05 06:07:27 crc kubenswrapper[4865]: E1205 06:07:27.488999 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b838e97c-05f0-4a3f-ab2d-4affdc12fa0e" containerName="extract-utilities" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.489028 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="b838e97c-05f0-4a3f-ab2d-4affdc12fa0e" containerName="extract-utilities" Dec 05 06:07:27 crc kubenswrapper[4865]: E1205 06:07:27.489055 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b838e97c-05f0-4a3f-ab2d-4affdc12fa0e" containerName="extract-content" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.489068 
4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="b838e97c-05f0-4a3f-ab2d-4affdc12fa0e" containerName="extract-content" Dec 05 06:07:27 crc kubenswrapper[4865]: E1205 06:07:27.489086 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afe32c73-a754-43f9-bc45-1ec0219469d9" containerName="util" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.489100 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe32c73-a754-43f9-bc45-1ec0219469d9" containerName="util" Dec 05 06:07:27 crc kubenswrapper[4865]: E1205 06:07:27.489121 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b838e97c-05f0-4a3f-ab2d-4affdc12fa0e" containerName="registry-server" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.489133 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="b838e97c-05f0-4a3f-ab2d-4affdc12fa0e" containerName="registry-server" Dec 05 06:07:27 crc kubenswrapper[4865]: E1205 06:07:27.489154 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afe32c73-a754-43f9-bc45-1ec0219469d9" containerName="pull" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.489166 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe32c73-a754-43f9-bc45-1ec0219469d9" containerName="pull" Dec 05 06:07:27 crc kubenswrapper[4865]: E1205 06:07:27.489191 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afe32c73-a754-43f9-bc45-1ec0219469d9" containerName="extract" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.489203 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe32c73-a754-43f9-bc45-1ec0219469d9" containerName="extract" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.489407 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="b838e97c-05f0-4a3f-ab2d-4affdc12fa0e" containerName="registry-server" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.489436 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="afe32c73-a754-43f9-bc45-1ec0219469d9" containerName="extract" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.490959 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48pxj" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.502935 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-48pxj"] Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.586568 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0430da8d-76b8-4bbb-8530-607b537dc3b4-catalog-content\") pod \"redhat-marketplace-48pxj\" (UID: \"0430da8d-76b8-4bbb-8530-607b537dc3b4\") " pod="openshift-marketplace/redhat-marketplace-48pxj" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.586699 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0430da8d-76b8-4bbb-8530-607b537dc3b4-utilities\") pod \"redhat-marketplace-48pxj\" (UID: \"0430da8d-76b8-4bbb-8530-607b537dc3b4\") " pod="openshift-marketplace/redhat-marketplace-48pxj" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.586738 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dw4h\" (UniqueName: \"kubernetes.io/projected/0430da8d-76b8-4bbb-8530-607b537dc3b4-kube-api-access-5dw4h\") pod \"redhat-marketplace-48pxj\" (UID: \"0430da8d-76b8-4bbb-8530-607b537dc3b4\") " pod="openshift-marketplace/redhat-marketplace-48pxj" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.688671 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0430da8d-76b8-4bbb-8530-607b537dc3b4-catalog-content\") pod \"redhat-marketplace-48pxj\" (UID: \"0430da8d-76b8-4bbb-8530-607b537dc3b4\") " pod="openshift-marketplace/redhat-marketplace-48pxj" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.689086 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0430da8d-76b8-4bbb-8530-607b537dc3b4-utilities\") pod \"redhat-marketplace-48pxj\" (UID: \"0430da8d-76b8-4bbb-8530-607b537dc3b4\") " pod="openshift-marketplace/redhat-marketplace-48pxj" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.689201 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dw4h\" (UniqueName: \"kubernetes.io/projected/0430da8d-76b8-4bbb-8530-607b537dc3b4-kube-api-access-5dw4h\") pod \"redhat-marketplace-48pxj\" (UID: \"0430da8d-76b8-4bbb-8530-607b537dc3b4\") " pod="openshift-marketplace/redhat-marketplace-48pxj" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.690416 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0430da8d-76b8-4bbb-8530-607b537dc3b4-catalog-content\") pod \"redhat-marketplace-48pxj\" (UID: \"0430da8d-76b8-4bbb-8530-607b537dc3b4\") " pod="openshift-marketplace/redhat-marketplace-48pxj" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.690456 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0430da8d-76b8-4bbb-8530-607b537dc3b4-utilities\") pod \"redhat-marketplace-48pxj\" (UID: \"0430da8d-76b8-4bbb-8530-607b537dc3b4\") " pod="openshift-marketplace/redhat-marketplace-48pxj" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.714906 4865 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5dw4h\" (UniqueName: \"kubernetes.io/projected/0430da8d-76b8-4bbb-8530-607b537dc3b4-kube-api-access-5dw4h\") pod \"redhat-marketplace-48pxj\" (UID: \"0430da8d-76b8-4bbb-8530-607b537dc3b4\") " pod="openshift-marketplace/redhat-marketplace-48pxj" Dec 05 06:07:27 crc kubenswrapper[4865]: I1205 06:07:27.810614 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48pxj" Dec 05 06:07:28 crc kubenswrapper[4865]: I1205 06:07:28.298974 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-48pxj"] Dec 05 06:07:28 crc kubenswrapper[4865]: I1205 06:07:28.592757 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-554dbdfbd5-l48sk"] Dec 05 06:07:28 crc kubenswrapper[4865]: I1205 06:07:28.593788 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-554dbdfbd5-l48sk" Dec 05 06:07:28 crc kubenswrapper[4865]: I1205 06:07:28.596933 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-l5pw2" Dec 05 06:07:28 crc kubenswrapper[4865]: I1205 06:07:28.632255 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-554dbdfbd5-l48sk"] Dec 05 06:07:28 crc kubenswrapper[4865]: I1205 06:07:28.702382 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhpsd\" (UniqueName: \"kubernetes.io/projected/7f835712-3e64-4461-89e1-4eac5548bff5-kube-api-access-mhpsd\") pod \"openstack-operator-controller-operator-554dbdfbd5-l48sk\" (UID: \"7f835712-3e64-4461-89e1-4eac5548bff5\") " pod="openstack-operators/openstack-operator-controller-operator-554dbdfbd5-l48sk" Dec 05 06:07:28 crc kubenswrapper[4865]: I1205 06:07:28.804049 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhpsd\" (UniqueName: \"kubernetes.io/projected/7f835712-3e64-4461-89e1-4eac5548bff5-kube-api-access-mhpsd\") pod \"openstack-operator-controller-operator-554dbdfbd5-l48sk\" (UID: \"7f835712-3e64-4461-89e1-4eac5548bff5\") " pod="openstack-operators/openstack-operator-controller-operator-554dbdfbd5-l48sk" Dec 05 06:07:28 crc kubenswrapper[4865]: I1205 06:07:28.828286 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhpsd\" (UniqueName: \"kubernetes.io/projected/7f835712-3e64-4461-89e1-4eac5548bff5-kube-api-access-mhpsd\") pod \"openstack-operator-controller-operator-554dbdfbd5-l48sk\" (UID: \"7f835712-3e64-4461-89e1-4eac5548bff5\") " pod="openstack-operators/openstack-operator-controller-operator-554dbdfbd5-l48sk" Dec 05 06:07:28 crc kubenswrapper[4865]: I1205 06:07:28.911248 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-554dbdfbd5-l48sk" Dec 05 06:07:29 crc kubenswrapper[4865]: I1205 06:07:29.072749 4865 generic.go:334] "Generic (PLEG): container finished" podID="0430da8d-76b8-4bbb-8530-607b537dc3b4" containerID="fd98a4dfa884ec16b34d56459124c7c6f1a2c77eb3ed22bbecf2c4dd6c97b0d8" exitCode=0 Dec 05 06:07:29 crc kubenswrapper[4865]: I1205 06:07:29.073211 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48pxj" event={"ID":"0430da8d-76b8-4bbb-8530-607b537dc3b4","Type":"ContainerDied","Data":"fd98a4dfa884ec16b34d56459124c7c6f1a2c77eb3ed22bbecf2c4dd6c97b0d8"} Dec 05 06:07:29 crc kubenswrapper[4865]: I1205 06:07:29.073251 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48pxj" event={"ID":"0430da8d-76b8-4bbb-8530-607b537dc3b4","Type":"ContainerStarted","Data":"ed88c25d9d17480bff8fe0344472d18810ad8be3d74f2fa3067fc38c5a4c0e3c"} Dec 05 06:07:29 crc kubenswrapper[4865]: I1205 06:07:29.223162 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-554dbdfbd5-l48sk"] Dec 05 06:07:29 crc kubenswrapper[4865]: W1205 06:07:29.243006 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f835712_3e64_4461_89e1_4eac5548bff5.slice/crio-b7f8cd2018ea2a5638c0e217e14bdc4ee3461973520ca35db2fd1400f0cd6590 WatchSource:0}: Error finding container b7f8cd2018ea2a5638c0e217e14bdc4ee3461973520ca35db2fd1400f0cd6590: Status 404 returned error can't find the container with id b7f8cd2018ea2a5638c0e217e14bdc4ee3461973520ca35db2fd1400f0cd6590 Dec 05 06:07:30 crc kubenswrapper[4865]: I1205 06:07:30.085158 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-554dbdfbd5-l48sk" event={"ID":"7f835712-3e64-4461-89e1-4eac5548bff5","Type":"ContainerStarted","Data":"b7f8cd2018ea2a5638c0e217e14bdc4ee3461973520ca35db2fd1400f0cd6590"} Dec 05 06:07:30 crc kubenswrapper[4865]: I1205 06:07:30.088951 4865 generic.go:334] "Generic (PLEG): container finished" podID="0430da8d-76b8-4bbb-8530-607b537dc3b4" containerID="6e03a15f69ac2015a44c8e06df71f78f4dcbed95d6b87d7d75b8cab6953e09bc" exitCode=0 Dec 05 06:07:30 crc kubenswrapper[4865]: I1205 06:07:30.088986 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48pxj" event={"ID":"0430da8d-76b8-4bbb-8530-607b537dc3b4","Type":"ContainerDied","Data":"6e03a15f69ac2015a44c8e06df71f78f4dcbed95d6b87d7d75b8cab6953e09bc"} Dec 05 06:07:31 crc kubenswrapper[4865]: I1205 06:07:31.110296 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48pxj" event={"ID":"0430da8d-76b8-4bbb-8530-607b537dc3b4","Type":"ContainerStarted","Data":"737db798b96e575c359f6c8d65ed3f3e35e6bfa143b98c97de61b2f5e140f2b3"} Dec 05 06:07:31 crc kubenswrapper[4865]: I1205 06:07:31.136798 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-48pxj" podStartSLOduration=2.716035836 podStartE2EDuration="4.135351419s" podCreationTimestamp="2025-12-05 06:07:27 +0000 UTC" firstStartedPulling="2025-12-05 06:07:29.080268158 +0000 UTC m=+868.360279380" lastFinishedPulling="2025-12-05 06:07:30.499583741 +0000 UTC m=+869.779594963" observedRunningTime="2025-12-05 06:07:31.134390573 +0000 UTC m=+870.414401795" 
watchObservedRunningTime="2025-12-05 06:07:31.135351419 +0000 UTC m=+870.415362631" Dec 05 06:07:36 crc kubenswrapper[4865]: I1205 06:07:36.149066 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-554dbdfbd5-l48sk" event={"ID":"7f835712-3e64-4461-89e1-4eac5548bff5","Type":"ContainerStarted","Data":"d8be897d75f3a8753fc4f98ecfcb646e3ed16426d66daccacf862473f9a326e0"} Dec 05 06:07:36 crc kubenswrapper[4865]: I1205 06:07:36.149586 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-554dbdfbd5-l48sk" Dec 05 06:07:36 crc kubenswrapper[4865]: I1205 06:07:36.182059 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-554dbdfbd5-l48sk" podStartSLOduration=2.243064742 podStartE2EDuration="8.182041972s" podCreationTimestamp="2025-12-05 06:07:28 +0000 UTC" firstStartedPulling="2025-12-05 06:07:29.24518789 +0000 UTC m=+868.525199112" lastFinishedPulling="2025-12-05 06:07:35.18416512 +0000 UTC m=+874.464176342" observedRunningTime="2025-12-05 06:07:36.177561498 +0000 UTC m=+875.457572720" watchObservedRunningTime="2025-12-05 06:07:36.182041972 +0000 UTC m=+875.462053194" Dec 05 06:07:37 crc kubenswrapper[4865]: I1205 06:07:37.813576 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-48pxj" Dec 05 06:07:37 crc kubenswrapper[4865]: I1205 06:07:37.814044 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-48pxj" Dec 05 06:07:37 crc kubenswrapper[4865]: I1205 06:07:37.873912 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-48pxj" Dec 05 06:07:38 crc kubenswrapper[4865]: I1205 06:07:38.204779 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-48pxj" Dec 05 06:07:38 crc kubenswrapper[4865]: I1205 06:07:38.675236 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-48pxj"] Dec 05 06:07:40 crc kubenswrapper[4865]: I1205 06:07:40.178303 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-48pxj" podUID="0430da8d-76b8-4bbb-8530-607b537dc3b4" containerName="registry-server" containerID="cri-o://737db798b96e575c359f6c8d65ed3f3e35e6bfa143b98c97de61b2f5e140f2b3" gracePeriod=2 Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.049406 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.049747 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.053351 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48pxj" Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.186518 4865 generic.go:334] "Generic (PLEG): container finished" podID="0430da8d-76b8-4bbb-8530-607b537dc3b4" containerID="737db798b96e575c359f6c8d65ed3f3e35e6bfa143b98c97de61b2f5e140f2b3" exitCode=0 Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.186561 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48pxj" event={"ID":"0430da8d-76b8-4bbb-8530-607b537dc3b4","Type":"ContainerDied","Data":"737db798b96e575c359f6c8d65ed3f3e35e6bfa143b98c97de61b2f5e140f2b3"} Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.186577 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48pxj" Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.186598 4865 scope.go:117] "RemoveContainer" containerID="737db798b96e575c359f6c8d65ed3f3e35e6bfa143b98c97de61b2f5e140f2b3" Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.186588 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48pxj" event={"ID":"0430da8d-76b8-4bbb-8530-607b537dc3b4","Type":"ContainerDied","Data":"ed88c25d9d17480bff8fe0344472d18810ad8be3d74f2fa3067fc38c5a4c0e3c"} Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.204523 4865 scope.go:117] "RemoveContainer" containerID="6e03a15f69ac2015a44c8e06df71f78f4dcbed95d6b87d7d75b8cab6953e09bc" Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.220989 4865 scope.go:117] "RemoveContainer" containerID="fd98a4dfa884ec16b34d56459124c7c6f1a2c77eb3ed22bbecf2c4dd6c97b0d8" Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.222036 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0430da8d-76b8-4bbb-8530-607b537dc3b4-catalog-content\") pod \"0430da8d-76b8-4bbb-8530-607b537dc3b4\" (UID: \"0430da8d-76b8-4bbb-8530-607b537dc3b4\") " Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.222190 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dw4h\" (UniqueName: \"kubernetes.io/projected/0430da8d-76b8-4bbb-8530-607b537dc3b4-kube-api-access-5dw4h\") pod \"0430da8d-76b8-4bbb-8530-607b537dc3b4\" (UID: \"0430da8d-76b8-4bbb-8530-607b537dc3b4\") " Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.222265 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0430da8d-76b8-4bbb-8530-607b537dc3b4-utilities\") pod \"0430da8d-76b8-4bbb-8530-607b537dc3b4\" (UID: \"0430da8d-76b8-4bbb-8530-607b537dc3b4\") " Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.223160 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0430da8d-76b8-4bbb-8530-607b537dc3b4-utilities" (OuterVolumeSpecName: "utilities") pod "0430da8d-76b8-4bbb-8530-607b537dc3b4" (UID: "0430da8d-76b8-4bbb-8530-607b537dc3b4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.230361 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0430da8d-76b8-4bbb-8530-607b537dc3b4-kube-api-access-5dw4h" (OuterVolumeSpecName: "kube-api-access-5dw4h") pod "0430da8d-76b8-4bbb-8530-607b537dc3b4" (UID: "0430da8d-76b8-4bbb-8530-607b537dc3b4"). InnerVolumeSpecName "kube-api-access-5dw4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.245942 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0430da8d-76b8-4bbb-8530-607b537dc3b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0430da8d-76b8-4bbb-8530-607b537dc3b4" (UID: "0430da8d-76b8-4bbb-8530-607b537dc3b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.278975 4865 scope.go:117] "RemoveContainer" containerID="737db798b96e575c359f6c8d65ed3f3e35e6bfa143b98c97de61b2f5e140f2b3" Dec 05 06:07:41 crc kubenswrapper[4865]: E1205 06:07:41.279339 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"737db798b96e575c359f6c8d65ed3f3e35e6bfa143b98c97de61b2f5e140f2b3\": container with ID starting with 737db798b96e575c359f6c8d65ed3f3e35e6bfa143b98c97de61b2f5e140f2b3 not found: ID does not exist" containerID="737db798b96e575c359f6c8d65ed3f3e35e6bfa143b98c97de61b2f5e140f2b3" Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.279389 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737db798b96e575c359f6c8d65ed3f3e35e6bfa143b98c97de61b2f5e140f2b3"} err="failed to get container status \"737db798b96e575c359f6c8d65ed3f3e35e6bfa143b98c97de61b2f5e140f2b3\": rpc error: code = NotFound desc = could not find container \"737db798b96e575c359f6c8d65ed3f3e35e6bfa143b98c97de61b2f5e140f2b3\": container with ID starting with 737db798b96e575c359f6c8d65ed3f3e35e6bfa143b98c97de61b2f5e140f2b3 not found: ID does not exist" Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.279412 4865 scope.go:117] "RemoveContainer" containerID="6e03a15f69ac2015a44c8e06df71f78f4dcbed95d6b87d7d75b8cab6953e09bc" Dec 05 06:07:41 crc kubenswrapper[4865]: E1205 06:07:41.279660 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e03a15f69ac2015a44c8e06df71f78f4dcbed95d6b87d7d75b8cab6953e09bc\": container with ID starting with 6e03a15f69ac2015a44c8e06df71f78f4dcbed95d6b87d7d75b8cab6953e09bc not found: ID does not exist" containerID="6e03a15f69ac2015a44c8e06df71f78f4dcbed95d6b87d7d75b8cab6953e09bc" Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.279701 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e03a15f69ac2015a44c8e06df71f78f4dcbed95d6b87d7d75b8cab6953e09bc"} err="failed to get container status \"6e03a15f69ac2015a44c8e06df71f78f4dcbed95d6b87d7d75b8cab6953e09bc\": rpc error: code = NotFound desc = could not find container \"6e03a15f69ac2015a44c8e06df71f78f4dcbed95d6b87d7d75b8cab6953e09bc\": container with ID starting with 6e03a15f69ac2015a44c8e06df71f78f4dcbed95d6b87d7d75b8cab6953e09bc not found: ID does not exist" Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.279717 4865 scope.go:117] "RemoveContainer" 
containerID="fd98a4dfa884ec16b34d56459124c7c6f1a2c77eb3ed22bbecf2c4dd6c97b0d8" Dec 05 06:07:41 crc kubenswrapper[4865]: E1205 06:07:41.279996 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd98a4dfa884ec16b34d56459124c7c6f1a2c77eb3ed22bbecf2c4dd6c97b0d8\": container with ID starting with fd98a4dfa884ec16b34d56459124c7c6f1a2c77eb3ed22bbecf2c4dd6c97b0d8 not found: ID does not exist" containerID="fd98a4dfa884ec16b34d56459124c7c6f1a2c77eb3ed22bbecf2c4dd6c97b0d8" Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.280016 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd98a4dfa884ec16b34d56459124c7c6f1a2c77eb3ed22bbecf2c4dd6c97b0d8"} err="failed to get container status \"fd98a4dfa884ec16b34d56459124c7c6f1a2c77eb3ed22bbecf2c4dd6c97b0d8\": rpc error: code = NotFound desc = could not find container \"fd98a4dfa884ec16b34d56459124c7c6f1a2c77eb3ed22bbecf2c4dd6c97b0d8\": container with ID starting with fd98a4dfa884ec16b34d56459124c7c6f1a2c77eb3ed22bbecf2c4dd6c97b0d8 not found: ID does not exist" Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.324354 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0430da8d-76b8-4bbb-8530-607b537dc3b4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.324419 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dw4h\" (UniqueName: \"kubernetes.io/projected/0430da8d-76b8-4bbb-8530-607b537dc3b4-kube-api-access-5dw4h\") on node \"crc\" DevicePath \"\"" Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.324436 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0430da8d-76b8-4bbb-8530-607b537dc3b4-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.525943 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-48pxj"] Dec 05 06:07:41 crc kubenswrapper[4865]: I1205 06:07:41.530220 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-48pxj"] Dec 05 06:07:43 crc kubenswrapper[4865]: I1205 06:07:43.019087 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0430da8d-76b8-4bbb-8530-607b537dc3b4" path="/var/lib/kubelet/pods/0430da8d-76b8-4bbb-8530-607b537dc3b4/volumes" Dec 05 06:07:48 crc kubenswrapper[4865]: I1205 06:07:48.913977 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-554dbdfbd5-l48sk" Dec 05 06:07:53 crc kubenswrapper[4865]: I1205 06:07:53.760035 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-djdgl"] Dec 05 06:07:53 crc kubenswrapper[4865]: E1205 06:07:53.760746 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0430da8d-76b8-4bbb-8530-607b537dc3b4" containerName="extract-utilities" Dec 05 06:07:53 crc kubenswrapper[4865]: I1205 06:07:53.760758 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="0430da8d-76b8-4bbb-8530-607b537dc3b4" containerName="extract-utilities" Dec 05 06:07:53 crc kubenswrapper[4865]: E1205 06:07:53.760771 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0430da8d-76b8-4bbb-8530-607b537dc3b4" containerName="registry-server" Dec 05 06:07:53 
crc kubenswrapper[4865]: I1205 06:07:53.760777 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="0430da8d-76b8-4bbb-8530-607b537dc3b4" containerName="registry-server" Dec 05 06:07:53 crc kubenswrapper[4865]: E1205 06:07:53.760785 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0430da8d-76b8-4bbb-8530-607b537dc3b4" containerName="extract-content" Dec 05 06:07:53 crc kubenswrapper[4865]: I1205 06:07:53.760794 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="0430da8d-76b8-4bbb-8530-607b537dc3b4" containerName="extract-content" Dec 05 06:07:53 crc kubenswrapper[4865]: I1205 06:07:53.760971 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="0430da8d-76b8-4bbb-8530-607b537dc3b4" containerName="registry-server" Dec 05 06:07:53 crc kubenswrapper[4865]: I1205 06:07:53.762107 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djdgl" Dec 05 06:07:53 crc kubenswrapper[4865]: I1205 06:07:53.792112 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djdgl"] Dec 05 06:07:53 crc kubenswrapper[4865]: I1205 06:07:53.797425 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75b2e75d-6906-4a48-953d-647cbc08256d-catalog-content\") pod \"community-operators-djdgl\" (UID: \"75b2e75d-6906-4a48-953d-647cbc08256d\") " pod="openshift-marketplace/community-operators-djdgl" Dec 05 06:07:53 crc kubenswrapper[4865]: I1205 06:07:53.797908 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k9z2\" (UniqueName: \"kubernetes.io/projected/75b2e75d-6906-4a48-953d-647cbc08256d-kube-api-access-2k9z2\") pod \"community-operators-djdgl\" (UID: \"75b2e75d-6906-4a48-953d-647cbc08256d\") " pod="openshift-marketplace/community-operators-djdgl" Dec 05 06:07:53 crc kubenswrapper[4865]: I1205 06:07:53.798045 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75b2e75d-6906-4a48-953d-647cbc08256d-utilities\") pod \"community-operators-djdgl\" (UID: \"75b2e75d-6906-4a48-953d-647cbc08256d\") " pod="openshift-marketplace/community-operators-djdgl" Dec 05 06:07:53 crc kubenswrapper[4865]: I1205 06:07:53.898699 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k9z2\" (UniqueName: \"kubernetes.io/projected/75b2e75d-6906-4a48-953d-647cbc08256d-kube-api-access-2k9z2\") pod \"community-operators-djdgl\" (UID: \"75b2e75d-6906-4a48-953d-647cbc08256d\") " pod="openshift-marketplace/community-operators-djdgl" Dec 05 06:07:53 crc kubenswrapper[4865]: I1205 06:07:53.898754 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75b2e75d-6906-4a48-953d-647cbc08256d-utilities\") pod \"community-operators-djdgl\" (UID: \"75b2e75d-6906-4a48-953d-647cbc08256d\") " pod="openshift-marketplace/community-operators-djdgl" Dec 05 06:07:53 crc kubenswrapper[4865]: I1205 06:07:53.898804 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75b2e75d-6906-4a48-953d-647cbc08256d-catalog-content\") pod \"community-operators-djdgl\" (UID: \"75b2e75d-6906-4a48-953d-647cbc08256d\") " 
pod="openshift-marketplace/community-operators-djdgl" Dec 05 06:07:53 crc kubenswrapper[4865]: I1205 06:07:53.899236 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75b2e75d-6906-4a48-953d-647cbc08256d-catalog-content\") pod \"community-operators-djdgl\" (UID: \"75b2e75d-6906-4a48-953d-647cbc08256d\") " pod="openshift-marketplace/community-operators-djdgl" Dec 05 06:07:53 crc kubenswrapper[4865]: I1205 06:07:53.899538 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75b2e75d-6906-4a48-953d-647cbc08256d-utilities\") pod \"community-operators-djdgl\" (UID: \"75b2e75d-6906-4a48-953d-647cbc08256d\") " pod="openshift-marketplace/community-operators-djdgl" Dec 05 06:07:53 crc kubenswrapper[4865]: I1205 06:07:53.930302 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k9z2\" (UniqueName: \"kubernetes.io/projected/75b2e75d-6906-4a48-953d-647cbc08256d-kube-api-access-2k9z2\") pod \"community-operators-djdgl\" (UID: \"75b2e75d-6906-4a48-953d-647cbc08256d\") " pod="openshift-marketplace/community-operators-djdgl" Dec 05 06:07:54 crc kubenswrapper[4865]: I1205 06:07:54.078363 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djdgl" Dec 05 06:07:54 crc kubenswrapper[4865]: I1205 06:07:54.666496 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djdgl"] Dec 05 06:07:55 crc kubenswrapper[4865]: I1205 06:07:55.268971 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djdgl" event={"ID":"75b2e75d-6906-4a48-953d-647cbc08256d","Type":"ContainerStarted","Data":"555c3f34baca8f16f87497efc6c143ab25583791d8e2aa9df0236b72897422f6"} Dec 05 06:07:57 crc kubenswrapper[4865]: I1205 06:07:57.281884 4865 generic.go:334] "Generic (PLEG): container finished" podID="75b2e75d-6906-4a48-953d-647cbc08256d" containerID="9a52c18dba41bf3eef5a55c9e8ba9a08e70f201c658d62ffd552c48c1862c0f8" exitCode=0 Dec 05 06:07:57 crc kubenswrapper[4865]: I1205 06:07:57.281960 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djdgl" event={"ID":"75b2e75d-6906-4a48-953d-647cbc08256d","Type":"ContainerDied","Data":"9a52c18dba41bf3eef5a55c9e8ba9a08e70f201c658d62ffd552c48c1862c0f8"} Dec 05 06:07:58 crc kubenswrapper[4865]: I1205 06:07:58.290580 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djdgl" event={"ID":"75b2e75d-6906-4a48-953d-647cbc08256d","Type":"ContainerStarted","Data":"3171f15144a8f30e68d77a4596e050f2feb48e291d960355e9f55537724a43d8"} Dec 05 06:07:59 crc kubenswrapper[4865]: I1205 06:07:59.298687 4865 generic.go:334] "Generic (PLEG): container finished" podID="75b2e75d-6906-4a48-953d-647cbc08256d" containerID="3171f15144a8f30e68d77a4596e050f2feb48e291d960355e9f55537724a43d8" exitCode=0 Dec 05 06:07:59 crc kubenswrapper[4865]: I1205 06:07:59.298776 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djdgl" event={"ID":"75b2e75d-6906-4a48-953d-647cbc08256d","Type":"ContainerDied","Data":"3171f15144a8f30e68d77a4596e050f2feb48e291d960355e9f55537724a43d8"} Dec 05 06:08:01 crc kubenswrapper[4865]: I1205 06:08:01.311372 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-djdgl" event={"ID":"75b2e75d-6906-4a48-953d-647cbc08256d","Type":"ContainerStarted","Data":"0821163367fe7b498def88423b203e09d69d7cd266cadcd5a9b86e196ad346d7"} Dec 05 06:08:04 crc kubenswrapper[4865]: I1205 06:08:04.079524 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-djdgl" Dec 05 06:08:04 crc kubenswrapper[4865]: I1205 06:08:04.079861 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-djdgl" Dec 05 06:08:04 crc kubenswrapper[4865]: I1205 06:08:04.127092 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-djdgl" Dec 05 06:08:04 crc kubenswrapper[4865]: I1205 06:08:04.165111 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-djdgl" podStartSLOduration=8.313440199 podStartE2EDuration="11.165092087s" podCreationTimestamp="2025-12-05 06:07:53 +0000 UTC" firstStartedPulling="2025-12-05 06:07:57.283936662 +0000 UTC m=+896.563947884" lastFinishedPulling="2025-12-05 06:08:00.13558855 +0000 UTC m=+899.415599772" observedRunningTime="2025-12-05 06:08:01.345275298 +0000 UTC m=+900.625286530" watchObservedRunningTime="2025-12-05 06:08:04.165092087 +0000 UTC m=+903.445103309" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.773964 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-7jl8d"] Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.775923 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7jl8d" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.778957 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-gl7nn" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.782446 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-97l79"] Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.785444 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-97l79" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.792301 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwjfp\" (UniqueName: \"kubernetes.io/projected/59231c2f-740e-4c04-af17-53dab82b3497-kube-api-access-cwjfp\") pod \"barbican-operator-controller-manager-7d9dfd778-7jl8d\" (UID: \"59231c2f-740e-4c04-af17-53dab82b3497\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7jl8d" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.796749 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-sr9m2" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.806612 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-7jl8d"] Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.817852 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-97l79"] Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.839572 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-j6st7"] Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.840508 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-j6st7" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.853509 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-7wrx8"] Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.854711 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wrx8" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.855795 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-4dlg2" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.856707 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-nzln9" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.893990 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcw8n\" (UniqueName: \"kubernetes.io/projected/30f6dc0d-1962-42c0-a128-d7a54943d849-kube-api-access-xcw8n\") pod \"designate-operator-controller-manager-697fb699cf-j6st7\" (UID: \"30f6dc0d-1962-42c0-a128-d7a54943d849\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-j6st7" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.894404 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9btkn\" (UniqueName: \"kubernetes.io/projected/a44f8567-c35d-4bf4-be5c-ffbde539bb3a-kube-api-access-9btkn\") pod \"glance-operator-controller-manager-5697bb5779-7wrx8\" (UID: \"a44f8567-c35d-4bf4-be5c-ffbde539bb3a\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wrx8" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.894997 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whlzh\" (UniqueName: \"kubernetes.io/projected/f4fc5327-1468-48aa-9a51-e8be8bfb5629-kube-api-access-whlzh\") pod \"cinder-operator-controller-manager-6c677c69b-97l79\" (UID: \"f4fc5327-1468-48aa-9a51-e8be8bfb5629\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-97l79" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.895125 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwjfp\" (UniqueName: \"kubernetes.io/projected/59231c2f-740e-4c04-af17-53dab82b3497-kube-api-access-cwjfp\") pod \"barbican-operator-controller-manager-7d9dfd778-7jl8d\" (UID: \"59231c2f-740e-4c04-af17-53dab82b3497\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7jl8d" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.944670 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kkstd"] Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.946779 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwjfp\" (UniqueName: \"kubernetes.io/projected/59231c2f-740e-4c04-af17-53dab82b3497-kube-api-access-cwjfp\") pod \"barbican-operator-controller-manager-7d9dfd778-7jl8d\" (UID: \"59231c2f-740e-4c04-af17-53dab82b3497\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7jl8d" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.954972 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kkstd" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.960217 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-c5t86" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.983795 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kkstd"] Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.999344 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9btkn\" (UniqueName: \"kubernetes.io/projected/a44f8567-c35d-4bf4-be5c-ffbde539bb3a-kube-api-access-9btkn\") pod \"glance-operator-controller-manager-5697bb5779-7wrx8\" (UID: \"a44f8567-c35d-4bf4-be5c-ffbde539bb3a\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wrx8" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.999393 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plr44\" (UniqueName: \"kubernetes.io/projected/87bce1fb-16c2-4c47-aa02-3f94aa681b58-kube-api-access-plr44\") pod \"heat-operator-controller-manager-5f64f6f8bb-kkstd\" (UID: \"87bce1fb-16c2-4c47-aa02-3f94aa681b58\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kkstd" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.999466 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whlzh\" (UniqueName: \"kubernetes.io/projected/f4fc5327-1468-48aa-9a51-e8be8bfb5629-kube-api-access-whlzh\") pod \"cinder-operator-controller-manager-6c677c69b-97l79\" (UID: \"f4fc5327-1468-48aa-9a51-e8be8bfb5629\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-97l79" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.999501 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcw8n\" (UniqueName: \"kubernetes.io/projected/30f6dc0d-1962-42c0-a128-d7a54943d849-kube-api-access-xcw8n\") pod \"designate-operator-controller-manager-697fb699cf-j6st7\" (UID: \"30f6dc0d-1962-42c0-a128-d7a54943d849\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-j6st7" Dec 05 06:08:05 crc kubenswrapper[4865]: I1205 06:08:05.999868 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-7wrx8"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.019533 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8zkdr"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.033701 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8zkdr" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.054340 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8zkdr"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.060069 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-x44ts" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.076600 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.077918 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.079726 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whlzh\" (UniqueName: \"kubernetes.io/projected/f4fc5327-1468-48aa-9a51-e8be8bfb5629-kube-api-access-whlzh\") pod \"cinder-operator-controller-manager-6c677c69b-97l79\" (UID: \"f4fc5327-1468-48aa-9a51-e8be8bfb5629\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-97l79" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.099432 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7jl8d" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.100622 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9btkn\" (UniqueName: \"kubernetes.io/projected/a44f8567-c35d-4bf4-be5c-ffbde539bb3a-kube-api-access-9btkn\") pod \"glance-operator-controller-manager-5697bb5779-7wrx8\" (UID: \"a44f8567-c35d-4bf4-be5c-ffbde539bb3a\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wrx8" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.103668 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.103992 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-hxwrb" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.109495 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcw8n\" (UniqueName: \"kubernetes.io/projected/30f6dc0d-1962-42c0-a128-d7a54943d849-kube-api-access-xcw8n\") pod \"designate-operator-controller-manager-697fb699cf-j6st7\" (UID: \"30f6dc0d-1962-42c0-a128-d7a54943d849\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-j6st7" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.112433 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6fzr\" (UniqueName: \"kubernetes.io/projected/db94fe25-0c93-4471-852d-45b20c0f266c-kube-api-access-x6fzr\") pod \"horizon-operator-controller-manager-68c6d99b8f-8zkdr\" (UID: \"db94fe25-0c93-4471-852d-45b20c0f266c\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8zkdr" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.113740 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/e13948be-6623-4815-af50-6e2b5ee807ba-cert\") pod \"infra-operator-controller-manager-758b7cbd9c-d2qcb\" (UID: \"e13948be-6623-4815-af50-6e2b5ee807ba\") " pod="openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.114262 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plr44\" (UniqueName: \"kubernetes.io/projected/87bce1fb-16c2-4c47-aa02-3f94aa681b58-kube-api-access-plr44\") pod \"heat-operator-controller-manager-5f64f6f8bb-kkstd\" (UID: \"87bce1fb-16c2-4c47-aa02-3f94aa681b58\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kkstd" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.114414 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz2fh\" (UniqueName: \"kubernetes.io/projected/e13948be-6623-4815-af50-6e2b5ee807ba-kube-api-access-kz2fh\") pod \"infra-operator-controller-manager-758b7cbd9c-d2qcb\" (UID: \"e13948be-6623-4815-af50-6e2b5ee807ba\") " pod="openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.112930 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-j6st7"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.140333 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-r8f45"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.141639 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-r8f45" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.167898 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-mlcgh"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.169015 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mlcgh" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.177489 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-tjfwh" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.177958 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-z5g2x" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.183581 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-97l79" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.204395 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-j6st7" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.218081 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6fzr\" (UniqueName: \"kubernetes.io/projected/db94fe25-0c93-4471-852d-45b20c0f266c-kube-api-access-x6fzr\") pod \"horizon-operator-controller-manager-68c6d99b8f-8zkdr\" (UID: \"db94fe25-0c93-4471-852d-45b20c0f266c\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8zkdr" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.219312 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wrx8" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.229576 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.232097 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e13948be-6623-4815-af50-6e2b5ee807ba-cert\") pod \"infra-operator-controller-manager-758b7cbd9c-d2qcb\" (UID: \"e13948be-6623-4815-af50-6e2b5ee807ba\") " pod="openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.234310 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz2fh\" (UniqueName: \"kubernetes.io/projected/e13948be-6623-4815-af50-6e2b5ee807ba-kube-api-access-kz2fh\") pod \"infra-operator-controller-manager-758b7cbd9c-d2qcb\" (UID: \"e13948be-6623-4815-af50-6e2b5ee807ba\") " pod="openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.234944 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knw2l\" (UniqueName: \"kubernetes.io/projected/8d67bcae-4ae9-4545-8410-236efec0cc30-kube-api-access-knw2l\") pod \"ironic-operator-controller-manager-967d97867-r8f45\" (UID: \"8d67bcae-4ae9-4545-8410-236efec0cc30\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-r8f45" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.235097 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbh97\" (UniqueName: \"kubernetes.io/projected/c3d9f2e6-7658-4f43-8d62-72bd4305c06a-kube-api-access-jbh97\") pod \"keystone-operator-controller-manager-7765d96ddf-mlcgh\" (UID: \"c3d9f2e6-7658-4f43-8d62-72bd4305c06a\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mlcgh" Dec 05 06:08:06 crc kubenswrapper[4865]: E1205 06:08:06.232624 4865 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 06:08:06 crc kubenswrapper[4865]: E1205 06:08:06.235411 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e13948be-6623-4815-af50-6e2b5ee807ba-cert podName:e13948be-6623-4815-af50-6e2b5ee807ba nodeName:}" failed. No retries permitted until 2025-12-05 06:08:06.735387319 +0000 UTC m=+906.015398541 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e13948be-6623-4815-af50-6e2b5ee807ba-cert") pod "infra-operator-controller-manager-758b7cbd9c-d2qcb" (UID: "e13948be-6623-4815-af50-6e2b5ee807ba") : secret "infra-operator-webhook-server-cert" not found Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.250899 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-mlcgh"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.268626 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plr44\" (UniqueName: \"kubernetes.io/projected/87bce1fb-16c2-4c47-aa02-3f94aa681b58-kube-api-access-plr44\") pod \"heat-operator-controller-manager-5f64f6f8bb-kkstd\" (UID: \"87bce1fb-16c2-4c47-aa02-3f94aa681b58\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kkstd" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.346419 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knw2l\" (UniqueName: \"kubernetes.io/projected/8d67bcae-4ae9-4545-8410-236efec0cc30-kube-api-access-knw2l\") pod \"ironic-operator-controller-manager-967d97867-r8f45\" (UID: \"8d67bcae-4ae9-4545-8410-236efec0cc30\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-r8f45" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.346485 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbh97\" (UniqueName: \"kubernetes.io/projected/c3d9f2e6-7658-4f43-8d62-72bd4305c06a-kube-api-access-jbh97\") pod \"keystone-operator-controller-manager-7765d96ddf-mlcgh\" (UID: \"c3d9f2e6-7658-4f43-8d62-72bd4305c06a\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mlcgh" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.347740 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6fzr\" (UniqueName: \"kubernetes.io/projected/db94fe25-0c93-4471-852d-45b20c0f266c-kube-api-access-x6fzr\") pod \"horizon-operator-controller-manager-68c6d99b8f-8zkdr\" (UID: \"db94fe25-0c93-4471-852d-45b20c0f266c\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8zkdr" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.332722 4865 util.go:30] "No sandbox for pod can be found. 
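
The MountVolume.SetUp failure just logged is not a mount bug: the "cert" volume of infra-operator-controller-manager-758b7cbd9c-d2qcb references the secret infra-operator-webhook-server-cert, which simply does not exist yet at this point in the log (presumably it is created later by whatever manages the operator's webhook certificates), so the kubelet schedules a retry after 500ms and the pod stays in ContainerCreating. A hedged client-go sketch that performs the same lookup the mount depends on and distinguishes "not created yet" from other errors (namespace and secret name are taken from the log; the kubeconfig path is an assumption):

package main

import (
	"context"
	"fmt"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	// The same object the failing "cert" volume mount depends on.
	_, err = cs.CoreV1().Secrets("openstack-operators").Get(context.Background(),
		"infra-operator-webhook-server-cert", metav1.GetOptions{})
	switch {
	case err == nil:
		fmt.Println("secret exists; the cert volume should mount on the next retry")
	case apierrors.IsNotFound(err):
		fmt.Println("secret not created yet; the kubelet will keep retrying the mount")
	default:
		fmt.Println("unexpected error:", err)
	}
}
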
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kkstd" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.372553 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz2fh\" (UniqueName: \"kubernetes.io/projected/e13948be-6623-4815-af50-6e2b5ee807ba-kube-api-access-kz2fh\") pod \"infra-operator-controller-manager-758b7cbd9c-d2qcb\" (UID: \"e13948be-6623-4815-af50-6e2b5ee807ba\") " pod="openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.415615 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knw2l\" (UniqueName: \"kubernetes.io/projected/8d67bcae-4ae9-4545-8410-236efec0cc30-kube-api-access-knw2l\") pod \"ironic-operator-controller-manager-967d97867-r8f45\" (UID: \"8d67bcae-4ae9-4545-8410-236efec0cc30\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-r8f45" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.431039 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-r8f45"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.431574 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbh97\" (UniqueName: \"kubernetes.io/projected/c3d9f2e6-7658-4f43-8d62-72bd4305c06a-kube-api-access-jbh97\") pod \"keystone-operator-controller-manager-7765d96ddf-mlcgh\" (UID: \"c3d9f2e6-7658-4f43-8d62-72bd4305c06a\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mlcgh" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.481991 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-vjgsh"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.483505 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-vjgsh" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.492891 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8zkdr" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.495231 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-rbj7x" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.535333 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v25kd"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.536548 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v25kd" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.542236 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-8w6gl" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.560298 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-cv8vc"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.562613 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-cv8vc" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.585913 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-vjgsh"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.587229 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-grcb2" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.602349 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v25kd"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.617686 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-r8f45" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.656090 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pmlg\" (UniqueName: \"kubernetes.io/projected/2364f477-be51-4698-914a-94d0fd2dd983-kube-api-access-5pmlg\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-v25kd\" (UID: \"2364f477-be51-4698-914a-94d0fd2dd983\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v25kd" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.656354 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8gnt\" (UniqueName: \"kubernetes.io/projected/60a54835-3802-4f32-be4f-ea7ace9084f6-kube-api-access-v8gnt\") pod \"manila-operator-controller-manager-7c79b5df47-vjgsh\" (UID: \"60a54835-3802-4f32-be4f-ea7ace9084f6\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-vjgsh" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.675458 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-7nqrp"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.676517 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7nqrp" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.684364 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-5v9lb" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.694066 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mlcgh" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.739117 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-cv8vc"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.757647 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pmlg\" (UniqueName: \"kubernetes.io/projected/2364f477-be51-4698-914a-94d0fd2dd983-kube-api-access-5pmlg\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-v25kd\" (UID: \"2364f477-be51-4698-914a-94d0fd2dd983\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v25kd" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.757722 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e13948be-6623-4815-af50-6e2b5ee807ba-cert\") pod \"infra-operator-controller-manager-758b7cbd9c-d2qcb\" (UID: \"e13948be-6623-4815-af50-6e2b5ee807ba\") " pod="openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.757745 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8gnt\" (UniqueName: \"kubernetes.io/projected/60a54835-3802-4f32-be4f-ea7ace9084f6-kube-api-access-v8gnt\") pod \"manila-operator-controller-manager-7c79b5df47-vjgsh\" (UID: \"60a54835-3802-4f32-be4f-ea7ace9084f6\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-vjgsh" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.757777 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfzqm\" (UniqueName: \"kubernetes.io/projected/1bad98dd-eca3-4f98-884a-655e104b2d92-kube-api-access-zfzqm\") pod \"mariadb-operator-controller-manager-79c8c4686c-cv8vc\" (UID: \"1bad98dd-eca3-4f98-884a-655e104b2d92\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-cv8vc" Dec 05 06:08:06 crc kubenswrapper[4865]: E1205 06:08:06.758204 4865 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 06:08:06 crc kubenswrapper[4865]: E1205 06:08:06.758245 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e13948be-6623-4815-af50-6e2b5ee807ba-cert podName:e13948be-6623-4815-af50-6e2b5ee807ba nodeName:}" failed. No retries permitted until 2025-12-05 06:08:07.75823101 +0000 UTC m=+907.038242232 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e13948be-6623-4815-af50-6e2b5ee807ba-cert") pod "infra-operator-controller-manager-758b7cbd9c-d2qcb" (UID: "e13948be-6623-4815-af50-6e2b5ee807ba") : secret "infra-operator-webhook-server-cert" not found Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.766852 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-7nqrp"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.801731 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-4546x"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.807099 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4546x" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.834408 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-xtddf" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.843109 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8gnt\" (UniqueName: \"kubernetes.io/projected/60a54835-3802-4f32-be4f-ea7ace9084f6-kube-api-access-v8gnt\") pod \"manila-operator-controller-manager-7c79b5df47-vjgsh\" (UID: \"60a54835-3802-4f32-be4f-ea7ace9084f6\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-vjgsh" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.843481 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-4546x"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.861584 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52xz2\" (UniqueName: \"kubernetes.io/projected/8e1c4c0e-047b-4727-9435-7192e4f48bea-kube-api-access-52xz2\") pod \"nova-operator-controller-manager-697bc559fc-7nqrp\" (UID: \"8e1c4c0e-047b-4727-9435-7192e4f48bea\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7nqrp" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.861726 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfzqm\" (UniqueName: \"kubernetes.io/projected/1bad98dd-eca3-4f98-884a-655e104b2d92-kube-api-access-zfzqm\") pod \"mariadb-operator-controller-manager-79c8c4686c-cv8vc\" (UID: \"1bad98dd-eca3-4f98-884a-655e104b2d92\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-cv8vc" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.861747 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pmlg\" (UniqueName: \"kubernetes.io/projected/2364f477-be51-4698-914a-94d0fd2dd983-kube-api-access-5pmlg\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-v25kd\" (UID: \"2364f477-be51-4698-914a-94d0fd2dd983\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v25kd" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.902204 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.903662 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.916801 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfzqm\" (UniqueName: \"kubernetes.io/projected/1bad98dd-eca3-4f98-884a-655e104b2d92-kube-api-access-zfzqm\") pod \"mariadb-operator-controller-manager-79c8c4686c-cv8vc\" (UID: \"1bad98dd-eca3-4f98-884a-655e104b2d92\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-cv8vc" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.923567 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-2jh72"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.925047 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2jh72" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.930511 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v25kd" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.937171 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-6bdqm" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.937756 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.945638 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-hrsbx" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.959913 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-2jh72"] Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.965418 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bm7d\" (UniqueName: \"kubernetes.io/projected/571eed7b-c231-42db-8acd-8f2efc828947-kube-api-access-7bm7d\") pod \"octavia-operator-controller-manager-998648c74-4546x\" (UID: \"571eed7b-c231-42db-8acd-8f2efc828947\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-4546x" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.965521 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52xz2\" (UniqueName: \"kubernetes.io/projected/8e1c4c0e-047b-4727-9435-7192e4f48bea-kube-api-access-52xz2\") pod \"nova-operator-controller-manager-697bc559fc-7nqrp\" (UID: \"8e1c4c0e-047b-4727-9435-7192e4f48bea\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7nqrp" Dec 05 06:08:06 crc kubenswrapper[4865]: I1205 06:08:06.979207 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-cv8vc" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.000188 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-cdf4c"] Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.001695 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-cdf4c" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.010096 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-ms5zl" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.019520 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52xz2\" (UniqueName: \"kubernetes.io/projected/8e1c4c0e-047b-4727-9435-7192e4f48bea-kube-api-access-52xz2\") pod \"nova-operator-controller-manager-697bc559fc-7nqrp\" (UID: \"8e1c4c0e-047b-4727-9435-7192e4f48bea\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7nqrp" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.058569 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7nqrp" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.082949 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zmf4\" (UniqueName: \"kubernetes.io/projected/51ef47f4-9d56-4555-9a53-007c8648651a-kube-api-access-6zmf4\") pod \"ovn-operator-controller-manager-b6456fdb6-2jh72\" (UID: \"51ef47f4-9d56-4555-9a53-007c8648651a\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2jh72" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.083040 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bm7d\" (UniqueName: \"kubernetes.io/projected/571eed7b-c231-42db-8acd-8f2efc828947-kube-api-access-7bm7d\") pod \"octavia-operator-controller-manager-998648c74-4546x\" (UID: \"571eed7b-c231-42db-8acd-8f2efc828947\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-4546x" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.083124 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d41068d-3439-4a1d-bb73-9d974c281d4c-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fzpfrb\" (UID: \"2d41068d-3439-4a1d-bb73-9d974c281d4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.083214 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74cgp\" (UniqueName: \"kubernetes.io/projected/2d41068d-3439-4a1d-bb73-9d974c281d4c-kube-api-access-74cgp\") pod \"openstack-baremetal-operator-controller-manager-84b575879fzpfrb\" (UID: \"2d41068d-3439-4a1d-bb73-9d974c281d4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.133695 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bm7d\" (UniqueName: \"kubernetes.io/projected/571eed7b-c231-42db-8acd-8f2efc828947-kube-api-access-7bm7d\") pod \"octavia-operator-controller-manager-998648c74-4546x\" (UID: \"571eed7b-c231-42db-8acd-8f2efc828947\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-4546x" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.139092 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-vjgsh" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.163255 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb"] Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.179179 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4546x" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.195141 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zmf4\" (UniqueName: \"kubernetes.io/projected/51ef47f4-9d56-4555-9a53-007c8648651a-kube-api-access-6zmf4\") pod \"ovn-operator-controller-manager-b6456fdb6-2jh72\" (UID: \"51ef47f4-9d56-4555-9a53-007c8648651a\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2jh72" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.195206 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhkxm\" (UniqueName: \"kubernetes.io/projected/1caf6bc1-a2e2-4330-bc4f-1f324ec5de84-kube-api-access-rhkxm\") pod \"placement-operator-controller-manager-78f8948974-cdf4c\" (UID: \"1caf6bc1-a2e2-4330-bc4f-1f324ec5de84\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-cdf4c" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.195226 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d41068d-3439-4a1d-bb73-9d974c281d4c-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fzpfrb\" (UID: \"2d41068d-3439-4a1d-bb73-9d974c281d4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.195273 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74cgp\" (UniqueName: \"kubernetes.io/projected/2d41068d-3439-4a1d-bb73-9d974c281d4c-kube-api-access-74cgp\") pod \"openstack-baremetal-operator-controller-manager-84b575879fzpfrb\" (UID: \"2d41068d-3439-4a1d-bb73-9d974c281d4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" Dec 05 06:08:07 crc kubenswrapper[4865]: E1205 06:08:07.196094 4865 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 06:08:07 crc kubenswrapper[4865]: E1205 06:08:07.196141 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d41068d-3439-4a1d-bb73-9d974c281d4c-cert podName:2d41068d-3439-4a1d-bb73-9d974c281d4c nodeName:}" failed. No retries permitted until 2025-12-05 06:08:07.696127637 +0000 UTC m=+906.976138859 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d41068d-3439-4a1d-bb73-9d974c281d4c-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fzpfrb" (UID: "2d41068d-3439-4a1d-bb73-9d974c281d4c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.204890 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-cdf4c"] Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.237935 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-qphvq"] Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.239388 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-qphvq" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.268474 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-jwzp7" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.269670 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74cgp\" (UniqueName: \"kubernetes.io/projected/2d41068d-3439-4a1d-bb73-9d974c281d4c-kube-api-access-74cgp\") pod \"openstack-baremetal-operator-controller-manager-84b575879fzpfrb\" (UID: \"2d41068d-3439-4a1d-bb73-9d974c281d4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.275781 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cppdn"] Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.277108 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cppdn" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.293455 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zmf4\" (UniqueName: \"kubernetes.io/projected/51ef47f4-9d56-4555-9a53-007c8648651a-kube-api-access-6zmf4\") pod \"ovn-operator-controller-manager-b6456fdb6-2jh72\" (UID: \"51ef47f4-9d56-4555-9a53-007c8648651a\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2jh72" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.297564 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhkxm\" (UniqueName: \"kubernetes.io/projected/1caf6bc1-a2e2-4330-bc4f-1f324ec5de84-kube-api-access-rhkxm\") pod \"placement-operator-controller-manager-78f8948974-cdf4c\" (UID: \"1caf6bc1-a2e2-4330-bc4f-1f324ec5de84\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-cdf4c" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.313326 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-qmmj2" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.351879 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-qphvq"] Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.446298 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cppdn"] Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.447134 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhkxm\" (UniqueName: \"kubernetes.io/projected/1caf6bc1-a2e2-4330-bc4f-1f324ec5de84-kube-api-access-rhkxm\") pod \"placement-operator-controller-manager-78f8948974-cdf4c\" (UID: \"1caf6bc1-a2e2-4330-bc4f-1f324ec5de84\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-cdf4c" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.452425 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr4ls\" (UniqueName: \"kubernetes.io/projected/1363659b-58f9-4f41-800c-863dd656d2b8-kube-api-access-nr4ls\") pod \"telemetry-operator-controller-manager-58d5ff84df-cppdn\" (UID: 
\"1363659b-58f9-4f41-800c-863dd656d2b8\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cppdn" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.452620 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7q2c\" (UniqueName: \"kubernetes.io/projected/0445a96f-f840-45c4-a1c3-f4455c49b216-kube-api-access-g7q2c\") pod \"swift-operator-controller-manager-9d58d64bc-qphvq\" (UID: \"0445a96f-f840-45c4-a1c3-f4455c49b216\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-qphvq" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.514554 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2jh72" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.516036 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-cdf4c" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.563971 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr4ls\" (UniqueName: \"kubernetes.io/projected/1363659b-58f9-4f41-800c-863dd656d2b8-kube-api-access-nr4ls\") pod \"telemetry-operator-controller-manager-58d5ff84df-cppdn\" (UID: \"1363659b-58f9-4f41-800c-863dd656d2b8\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cppdn" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.564155 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7q2c\" (UniqueName: \"kubernetes.io/projected/0445a96f-f840-45c4-a1c3-f4455c49b216-kube-api-access-g7q2c\") pod \"swift-operator-controller-manager-9d58d64bc-qphvq\" (UID: \"0445a96f-f840-45c4-a1c3-f4455c49b216\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-qphvq" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.615771 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-j5vmw"] Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.618760 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j5vmw" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.656962 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr4ls\" (UniqueName: \"kubernetes.io/projected/1363659b-58f9-4f41-800c-863dd656d2b8-kube-api-access-nr4ls\") pod \"telemetry-operator-controller-manager-58d5ff84df-cppdn\" (UID: \"1363659b-58f9-4f41-800c-863dd656d2b8\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cppdn" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.665167 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-zzb4b"] Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.670684 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-qldg4" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.681402 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7q2c\" (UniqueName: \"kubernetes.io/projected/0445a96f-f840-45c4-a1c3-f4455c49b216-kube-api-access-g7q2c\") pod \"swift-operator-controller-manager-9d58d64bc-qphvq\" (UID: \"0445a96f-f840-45c4-a1c3-f4455c49b216\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-qphvq" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.712619 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-zzb4b" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.733673 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-hw976" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.739783 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-j5vmw"] Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.773772 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e13948be-6623-4815-af50-6e2b5ee807ba-cert\") pod \"infra-operator-controller-manager-758b7cbd9c-d2qcb\" (UID: \"e13948be-6623-4815-af50-6e2b5ee807ba\") " pod="openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.774132 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d41068d-3439-4a1d-bb73-9d974c281d4c-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fzpfrb\" (UID: \"2d41068d-3439-4a1d-bb73-9d974c281d4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.774319 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nsd5\" (UniqueName: \"kubernetes.io/projected/9b591a19-b272-4a03-8164-c0296161feb7-kube-api-access-2nsd5\") pod \"test-operator-controller-manager-5854674fcc-j5vmw\" (UID: \"9b591a19-b272-4a03-8164-c0296161feb7\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-j5vmw" Dec 05 06:08:07 crc kubenswrapper[4865]: E1205 06:08:07.774537 4865 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret 
"openstack-baremetal-operator-webhook-server-cert" not found Dec 05 06:08:07 crc kubenswrapper[4865]: E1205 06:08:07.774621 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d41068d-3439-4a1d-bb73-9d974c281d4c-cert podName:2d41068d-3439-4a1d-bb73-9d974c281d4c nodeName:}" failed. No retries permitted until 2025-12-05 06:08:08.774601803 +0000 UTC m=+908.054613025 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d41068d-3439-4a1d-bb73-9d974c281d4c-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fzpfrb" (UID: "2d41068d-3439-4a1d-bb73-9d974c281d4c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 06:08:07 crc kubenswrapper[4865]: E1205 06:08:07.774629 4865 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 06:08:07 crc kubenswrapper[4865]: E1205 06:08:07.774733 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e13948be-6623-4815-af50-6e2b5ee807ba-cert podName:e13948be-6623-4815-af50-6e2b5ee807ba nodeName:}" failed. No retries permitted until 2025-12-05 06:08:09.774703286 +0000 UTC m=+909.054714688 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e13948be-6623-4815-af50-6e2b5ee807ba-cert") pod "infra-operator-controller-manager-758b7cbd9c-d2qcb" (UID: "e13948be-6623-4815-af50-6e2b5ee807ba") : secret "infra-operator-webhook-server-cert" not found Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.778917 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-zzb4b"] Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.845944 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47"] Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.847769 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.856135 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.856392 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.876800 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlhz7\" (UniqueName: \"kubernetes.io/projected/cf1398f2-aa09-45bb-9a98-5fadca999284-kube-api-access-zlhz7\") pod \"watcher-operator-controller-manager-667bd8d554-zzb4b\" (UID: \"cf1398f2-aa09-45bb-9a98-5fadca999284\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-zzb4b" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.876878 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nsd5\" (UniqueName: \"kubernetes.io/projected/9b591a19-b272-4a03-8164-c0296161feb7-kube-api-access-2nsd5\") pod \"test-operator-controller-manager-5854674fcc-j5vmw\" (UID: \"9b591a19-b272-4a03-8164-c0296161feb7\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-j5vmw" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.878272 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-kfh6j" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.883946 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47"] Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.920812 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-7jl8d"] Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.926313 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nsd5\" (UniqueName: \"kubernetes.io/projected/9b591a19-b272-4a03-8164-c0296161feb7-kube-api-access-2nsd5\") pod \"test-operator-controller-manager-5854674fcc-j5vmw\" (UID: \"9b591a19-b272-4a03-8164-c0296161feb7\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-j5vmw" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.930301 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-qphvq" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.944491 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cppdn" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.960412 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bg55s"] Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.962087 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bg55s" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.969397 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-zq585" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.979899 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j5vmw" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.981307 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59jsr\" (UniqueName: \"kubernetes.io/projected/0e3dd976-2c50-4721-a9a3-330c906f0e16-kube-api-access-59jsr\") pod \"openstack-operator-controller-manager-6f6696b64-hqh47\" (UID: \"0e3dd976-2c50-4721-a9a3-330c906f0e16\") " pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.981390 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-webhook-certs\") pod \"openstack-operator-controller-manager-6f6696b64-hqh47\" (UID: \"0e3dd976-2c50-4721-a9a3-330c906f0e16\") " pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.981434 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlhz7\" (UniqueName: \"kubernetes.io/projected/cf1398f2-aa09-45bb-9a98-5fadca999284-kube-api-access-zlhz7\") pod \"watcher-operator-controller-manager-667bd8d554-zzb4b\" (UID: \"cf1398f2-aa09-45bb-9a98-5fadca999284\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-zzb4b" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.981509 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-metrics-certs\") pod \"openstack-operator-controller-manager-6f6696b64-hqh47\" (UID: \"0e3dd976-2c50-4721-a9a3-330c906f0e16\") " pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:07 crc kubenswrapper[4865]: I1205 06:08:07.992413 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bg55s"] Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.030952 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-7wrx8"] Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.082510 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-metrics-certs\") pod \"openstack-operator-controller-manager-6f6696b64-hqh47\" (UID: \"0e3dd976-2c50-4721-a9a3-330c906f0e16\") " pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.082575 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpdk5\" (UniqueName: \"kubernetes.io/projected/c21265ee-9968-411a-9387-f0c3920b3883-kube-api-access-gpdk5\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-bg55s\" (UID: \"c21265ee-9968-411a-9387-f0c3920b3883\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bg55s" Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.082603 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59jsr\" (UniqueName: \"kubernetes.io/projected/0e3dd976-2c50-4721-a9a3-330c906f0e16-kube-api-access-59jsr\") pod \"openstack-operator-controller-manager-6f6696b64-hqh47\" (UID: \"0e3dd976-2c50-4721-a9a3-330c906f0e16\") " pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.082664 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-webhook-certs\") pod \"openstack-operator-controller-manager-6f6696b64-hqh47\" (UID: \"0e3dd976-2c50-4721-a9a3-330c906f0e16\") " pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:08 crc kubenswrapper[4865]: E1205 06:08:08.100943 4865 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 06:08:08 crc kubenswrapper[4865]: E1205 06:08:08.101027 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-metrics-certs podName:0e3dd976-2c50-4721-a9a3-330c906f0e16 nodeName:}" failed. No retries permitted until 2025-12-05 06:08:08.601004312 +0000 UTC m=+907.881015534 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-metrics-certs") pod "openstack-operator-controller-manager-6f6696b64-hqh47" (UID: "0e3dd976-2c50-4721-a9a3-330c906f0e16") : secret "metrics-server-cert" not found Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.114543 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlhz7\" (UniqueName: \"kubernetes.io/projected/cf1398f2-aa09-45bb-9a98-5fadca999284-kube-api-access-zlhz7\") pod \"watcher-operator-controller-manager-667bd8d554-zzb4b\" (UID: \"cf1398f2-aa09-45bb-9a98-5fadca999284\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-zzb4b" Dec 05 06:08:08 crc kubenswrapper[4865]: E1205 06:08:08.117096 4865 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 06:08:08 crc kubenswrapper[4865]: E1205 06:08:08.117177 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-webhook-certs podName:0e3dd976-2c50-4721-a9a3-330c906f0e16 nodeName:}" failed. No retries permitted until 2025-12-05 06:08:08.617153108 +0000 UTC m=+907.897164330 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-webhook-certs") pod "openstack-operator-controller-manager-6f6696b64-hqh47" (UID: "0e3dd976-2c50-4721-a9a3-330c906f0e16") : secret "webhook-server-cert" not found Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.135529 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-j6st7"] Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.190248 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpdk5\" (UniqueName: \"kubernetes.io/projected/c21265ee-9968-411a-9387-f0c3920b3883-kube-api-access-gpdk5\") pod \"rabbitmq-cluster-operator-manager-668c99d594-bg55s\" (UID: \"c21265ee-9968-411a-9387-f0c3920b3883\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bg55s" Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.236759 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-zzb4b" Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.237894 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59jsr\" (UniqueName: \"kubernetes.io/projected/0e3dd976-2c50-4721-a9a3-330c906f0e16-kube-api-access-59jsr\") pod \"openstack-operator-controller-manager-6f6696b64-hqh47\" (UID: \"0e3dd976-2c50-4721-a9a3-330c906f0e16\") " pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.263669 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpdk5\" (UniqueName: \"kubernetes.io/projected/c21265ee-9968-411a-9387-f0c3920b3883-kube-api-access-gpdk5\") pod \"rabbitmq-cluster-operator-manager-668c99d594-bg55s\" (UID: \"c21265ee-9968-411a-9387-f0c3920b3883\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bg55s" Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.296786 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bg55s" Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.523154 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7jl8d" event={"ID":"59231c2f-740e-4c04-af17-53dab82b3497","Type":"ContainerStarted","Data":"eb397ed0ec8d2c227dd74b54d552959f8164a55f5363aad8422df9d468ddf92b"} Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.528838 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wrx8" event={"ID":"a44f8567-c35d-4bf4-be5c-ffbde539bb3a","Type":"ContainerStarted","Data":"3a166efe4a534ca01105fb9cce3b00bc87aaeaea6224f91ec4feba9acd66ae83"} Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.542348 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-j6st7" event={"ID":"30f6dc0d-1962-42c0-a128-d7a54943d849","Type":"ContainerStarted","Data":"0ed93f3984a7040b2358cd4bdee153866c698897de0d9773dececce8a00f1eaf"} Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.604256 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-metrics-certs\") pod \"openstack-operator-controller-manager-6f6696b64-hqh47\" (UID: \"0e3dd976-2c50-4721-a9a3-330c906f0e16\") " pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:08 crc kubenswrapper[4865]: E1205 06:08:08.604789 4865 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 06:08:08 crc kubenswrapper[4865]: E1205 06:08:08.604863 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-metrics-certs podName:0e3dd976-2c50-4721-a9a3-330c906f0e16 nodeName:}" failed. No retries permitted until 2025-12-05 06:08:09.604848549 +0000 UTC m=+908.884859771 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-metrics-certs") pod "openstack-operator-controller-manager-6f6696b64-hqh47" (UID: "0e3dd976-2c50-4721-a9a3-330c906f0e16") : secret "metrics-server-cert" not found Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.712175 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-webhook-certs\") pod \"openstack-operator-controller-manager-6f6696b64-hqh47\" (UID: \"0e3dd976-2c50-4721-a9a3-330c906f0e16\") " pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:08 crc kubenswrapper[4865]: E1205 06:08:08.712355 4865 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 06:08:08 crc kubenswrapper[4865]: E1205 06:08:08.712411 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-webhook-certs podName:0e3dd976-2c50-4721-a9a3-330c906f0e16 nodeName:}" failed. No retries permitted until 2025-12-05 06:08:09.712397497 +0000 UTC m=+908.992408719 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-webhook-certs") pod "openstack-operator-controller-manager-6f6696b64-hqh47" (UID: "0e3dd976-2c50-4721-a9a3-330c906f0e16") : secret "webhook-server-cert" not found Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.814447 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8zkdr"] Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.816676 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d41068d-3439-4a1d-bb73-9d974c281d4c-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fzpfrb\" (UID: \"2d41068d-3439-4a1d-bb73-9d974c281d4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" Dec 05 06:08:08 crc kubenswrapper[4865]: E1205 06:08:08.816956 4865 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 06:08:08 crc kubenswrapper[4865]: E1205 06:08:08.817012 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d41068d-3439-4a1d-bb73-9d974c281d4c-cert podName:2d41068d-3439-4a1d-bb73-9d974c281d4c nodeName:}" failed. No retries permitted until 2025-12-05 06:08:10.816998034 +0000 UTC m=+910.097009256 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d41068d-3439-4a1d-bb73-9d974c281d4c-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fzpfrb" (UID: "2d41068d-3439-4a1d-bb73-9d974c281d4c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 06:08:08 crc kubenswrapper[4865]: W1205 06:08:08.882575 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4fc5327_1468_48aa_9a51_e8be8bfb5629.slice/crio-1cd146b71e9e7003f506db0e042627d4c41d8b338a2f98d929ab5f9533dfb481 WatchSource:0}: Error finding container 1cd146b71e9e7003f506db0e042627d4c41d8b338a2f98d929ab5f9533dfb481: Status 404 returned error can't find the container with id 1cd146b71e9e7003f506db0e042627d4c41d8b338a2f98d929ab5f9533dfb481 Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.891772 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-97l79"] Dec 05 06:08:08 crc kubenswrapper[4865]: W1205 06:08:08.899930 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87bce1fb_16c2_4c47_aa02_3f94aa681b58.slice/crio-879c6363d243b76e9876afcec5c2212892462777ca639a1e22554dba3b93fe2d WatchSource:0}: Error finding container 879c6363d243b76e9876afcec5c2212892462777ca639a1e22554dba3b93fe2d: Status 404 returned error can't find the container with id 879c6363d243b76e9876afcec5c2212892462777ca639a1e22554dba3b93fe2d Dec 05 06:08:08 crc kubenswrapper[4865]: W1205 06:08:08.901267 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d67bcae_4ae9_4545_8410_236efec0cc30.slice/crio-3a8748f2d696cf3818018db7981ffc9fd81221f82708558188450a640b0a45eb WatchSource:0}: Error finding container 3a8748f2d696cf3818018db7981ffc9fd81221f82708558188450a640b0a45eb: Status 
404 returned error can't find the container with id 3a8748f2d696cf3818018db7981ffc9fd81221f82708558188450a640b0a45eb Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.911949 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-r8f45"] Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.926815 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kkstd"] Dec 05 06:08:08 crc kubenswrapper[4865]: I1205 06:08:08.934386 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v25kd"] Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.211034 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-vjgsh"] Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.254546 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-cv8vc"] Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.273440 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-4546x"] Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.288490 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-mlcgh"] Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.348895 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-2jh72"] Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.378031 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-cdf4c"] Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.386813 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-7nqrp"] Dec 05 06:08:09 crc kubenswrapper[4865]: W1205 06:08:09.400783 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e1c4c0e_047b_4727_9435_7192e4f48bea.slice/crio-838ddf47ab99c39fb1ef141bd36a3b49ab8577cbd6aa7318344bcc5d73121f08 WatchSource:0}: Error finding container 838ddf47ab99c39fb1ef141bd36a3b49ab8577cbd6aa7318344bcc5d73121f08: Status 404 returned error can't find the container with id 838ddf47ab99c39fb1ef141bd36a3b49ab8577cbd6aa7318344bcc5d73121f08 Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.556610 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-cv8vc" event={"ID":"1bad98dd-eca3-4f98-884a-655e104b2d92","Type":"ContainerStarted","Data":"aa01bb9dfa573e471eb2301c545a134d5ecf197d36295259974d3d2b1e167cbb"} Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.558961 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v25kd" event={"ID":"2364f477-be51-4698-914a-94d0fd2dd983","Type":"ContainerStarted","Data":"cfcd78330c37a79875d6dcdec658475f01e3dbd71967268a71c47b390123e3ce"} Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.560087 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-vjgsh" 
event={"ID":"60a54835-3802-4f32-be4f-ea7ace9084f6","Type":"ContainerStarted","Data":"64365338f1d16215d2ade9d42916243bd92319cc6a1f2f0c0f8ae8eccae1d871"} Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.560863 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-r8f45" event={"ID":"8d67bcae-4ae9-4545-8410-236efec0cc30","Type":"ContainerStarted","Data":"3a8748f2d696cf3818018db7981ffc9fd81221f82708558188450a640b0a45eb"} Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.585901 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7nqrp" event={"ID":"8e1c4c0e-047b-4727-9435-7192e4f48bea","Type":"ContainerStarted","Data":"838ddf47ab99c39fb1ef141bd36a3b49ab8577cbd6aa7318344bcc5d73121f08"} Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.587034 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-cdf4c" event={"ID":"1caf6bc1-a2e2-4330-bc4f-1f324ec5de84","Type":"ContainerStarted","Data":"b804a126abb5372f5875189ebf662f9e34fc63bc948eeccea5e446e54d23f3cb"} Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.594218 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kkstd" event={"ID":"87bce1fb-16c2-4c47-aa02-3f94aa681b58","Type":"ContainerStarted","Data":"879c6363d243b76e9876afcec5c2212892462777ca639a1e22554dba3b93fe2d"} Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.598487 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-97l79" event={"ID":"f4fc5327-1468-48aa-9a51-e8be8bfb5629","Type":"ContainerStarted","Data":"1cd146b71e9e7003f506db0e042627d4c41d8b338a2f98d929ab5f9533dfb481"} Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.615269 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8zkdr" event={"ID":"db94fe25-0c93-4471-852d-45b20c0f266c","Type":"ContainerStarted","Data":"ac91c3960de15c1adde25bc842e648a8d5b7b3aa8c540a5b894136b3a924b79e"} Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.617773 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2jh72" event={"ID":"51ef47f4-9d56-4555-9a53-007c8648651a","Type":"ContainerStarted","Data":"1b6fd0b2e92ffb41b521b86ff0060444235bb6d82aa55fbfaa11ac59bd1aaf60"} Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.621525 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bg55s"] Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.623581 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4546x" event={"ID":"571eed7b-c231-42db-8acd-8f2efc828947","Type":"ContainerStarted","Data":"9969e449ab29685b8e4b517527ee5823ee21c344a4eeee75a00de071cdae577f"} Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.628602 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mlcgh" event={"ID":"c3d9f2e6-7658-4f43-8d62-72bd4305c06a","Type":"ContainerStarted","Data":"cc1baf956e0084f8a62a9a4bc86ad17a74886431c447a0edcf44c20fb4b1f76a"} Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.652496 4865 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-j5vmw"] Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.672654 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-metrics-certs\") pod \"openstack-operator-controller-manager-6f6696b64-hqh47\" (UID: \"0e3dd976-2c50-4721-a9a3-330c906f0e16\") " pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:09 crc kubenswrapper[4865]: E1205 06:08:09.673331 4865 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 06:08:09 crc kubenswrapper[4865]: E1205 06:08:09.673410 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-metrics-certs podName:0e3dd976-2c50-4721-a9a3-330c906f0e16 nodeName:}" failed. No retries permitted until 2025-12-05 06:08:11.673391781 +0000 UTC m=+910.953403003 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-metrics-certs") pod "openstack-operator-controller-manager-6f6696b64-hqh47" (UID: "0e3dd976-2c50-4721-a9a3-330c906f0e16") : secret "metrics-server-cert" not found Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.689442 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cppdn"] Dec 05 06:08:09 crc kubenswrapper[4865]: E1205 06:08:09.709714 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nr4ls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-cppdn_openstack-operators(1363659b-58f9-4f41-800c-863dd656d2b8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 06:08:09 crc kubenswrapper[4865]: E1205 06:08:09.714480 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nr4ls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-cppdn_openstack-operators(1363659b-58f9-4f41-800c-863dd656d2b8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 06:08:09 crc kubenswrapper[4865]: E1205 06:08:09.714607 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zlhz7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-667bd8d554-zzb4b_openstack-operators(cf1398f2-aa09-45bb-9a98-5fadca999284): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 06:08:09 crc kubenswrapper[4865]: E1205 06:08:09.715956 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cppdn" podUID="1363659b-58f9-4f41-800c-863dd656d2b8" Dec 05 06:08:09 crc kubenswrapper[4865]: E1205 06:08:09.719001 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zlhz7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-667bd8d554-zzb4b_openstack-operators(cf1398f2-aa09-45bb-9a98-5fadca999284): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 06:08:09 crc kubenswrapper[4865]: E1205 06:08:09.720275 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-zzb4b" podUID="cf1398f2-aa09-45bb-9a98-5fadca999284" Dec 05 06:08:09 crc kubenswrapper[4865]: W1205 06:08:09.721842 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b591a19_b272_4a03_8164_c0296161feb7.slice/crio-6136ed0280b4409b055dd09e091ab2bca558404b1720d7a961d6d0d04264c84e WatchSource:0}: Error finding container 6136ed0280b4409b055dd09e091ab2bca558404b1720d7a961d6d0d04264c84e: Status 404 returned error can't find the container with id 6136ed0280b4409b055dd09e091ab2bca558404b1720d7a961d6d0d04264c84e Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.725935 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-qphvq"] Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.733071 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-zzb4b"] Dec 05 06:08:09 crc kubenswrapper[4865]: E1205 06:08:09.751599 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2nsd5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-j5vmw_openstack-operators(9b591a19-b272-4a03-8164-c0296161feb7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 06:08:09 crc kubenswrapper[4865]: E1205 06:08:09.754143 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2nsd5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-j5vmw_openstack-operators(9b591a19-b272-4a03-8164-c0296161feb7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 06:08:09 crc kubenswrapper[4865]: E1205 06:08:09.755412 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS 
exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j5vmw" podUID="9b591a19-b272-4a03-8164-c0296161feb7" Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.774787 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-webhook-certs\") pod \"openstack-operator-controller-manager-6f6696b64-hqh47\" (UID: \"0e3dd976-2c50-4721-a9a3-330c906f0e16\") " pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:09 crc kubenswrapper[4865]: E1205 06:08:09.775464 4865 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 06:08:09 crc kubenswrapper[4865]: E1205 06:08:09.775768 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-webhook-certs podName:0e3dd976-2c50-4721-a9a3-330c906f0e16 nodeName:}" failed. No retries permitted until 2025-12-05 06:08:11.775655454 +0000 UTC m=+911.055666866 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-webhook-certs") pod "openstack-operator-controller-manager-6f6696b64-hqh47" (UID: "0e3dd976-2c50-4721-a9a3-330c906f0e16") : secret "webhook-server-cert" not found Dec 05 06:08:09 crc kubenswrapper[4865]: I1205 06:08:09.877317 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e13948be-6623-4815-af50-6e2b5ee807ba-cert\") pod \"infra-operator-controller-manager-758b7cbd9c-d2qcb\" (UID: \"e13948be-6623-4815-af50-6e2b5ee807ba\") " pod="openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb" Dec 05 06:08:09 crc kubenswrapper[4865]: E1205 06:08:09.877539 4865 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 06:08:09 crc kubenswrapper[4865]: E1205 06:08:09.877655 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e13948be-6623-4815-af50-6e2b5ee807ba-cert podName:e13948be-6623-4815-af50-6e2b5ee807ba nodeName:}" failed. No retries permitted until 2025-12-05 06:08:13.877623318 +0000 UTC m=+913.157634700 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e13948be-6623-4815-af50-6e2b5ee807ba-cert") pod "infra-operator-controller-manager-758b7cbd9c-d2qcb" (UID: "e13948be-6623-4815-af50-6e2b5ee807ba") : secret "infra-operator-webhook-server-cert" not found Dec 05 06:08:10 crc kubenswrapper[4865]: I1205 06:08:10.650097 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bg55s" event={"ID":"c21265ee-9968-411a-9387-f0c3920b3883","Type":"ContainerStarted","Data":"17ae6789b1ff48d9631dc144142178f7a4b399b07d28b0eac9ffc378c3b752da"} Dec 05 06:08:10 crc kubenswrapper[4865]: I1205 06:08:10.655515 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j5vmw" event={"ID":"9b591a19-b272-4a03-8164-c0296161feb7","Type":"ContainerStarted","Data":"6136ed0280b4409b055dd09e091ab2bca558404b1720d7a961d6d0d04264c84e"} Dec 05 06:08:10 crc kubenswrapper[4865]: E1205 06:08:10.660525 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j5vmw" podUID="9b591a19-b272-4a03-8164-c0296161feb7" Dec 05 06:08:10 crc kubenswrapper[4865]: I1205 06:08:10.661265 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cppdn" event={"ID":"1363659b-58f9-4f41-800c-863dd656d2b8","Type":"ContainerStarted","Data":"78c8e2320cb05c9fda1951f154d9fa1674d2bfb5be0d708dd270a76058a5bf26"} Dec 05 06:08:10 crc kubenswrapper[4865]: I1205 06:08:10.662562 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-zzb4b" event={"ID":"cf1398f2-aa09-45bb-9a98-5fadca999284","Type":"ContainerStarted","Data":"adb2962d472ce0d40206d1a008a063681c3b03b929cce6ce267c219f94255f3c"} Dec 05 06:08:10 crc kubenswrapper[4865]: E1205 06:08:10.663924 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cppdn" podUID="1363659b-58f9-4f41-800c-863dd656d2b8" Dec 05 06:08:10 crc kubenswrapper[4865]: I1205 06:08:10.664917 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-qphvq" event={"ID":"0445a96f-f840-45c4-a1c3-f4455c49b216","Type":"ContainerStarted","Data":"698c7100d80b54588117c5afbe0f05fa9564dc4245caed07a64725d34d37d842"} Dec 05 06:08:10 crc kubenswrapper[4865]: E1205 06:08:10.666009 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-zzb4b" podUID="cf1398f2-aa09-45bb-9a98-5fadca999284" Dec 05 06:08:10 crc kubenswrapper[4865]: I1205 06:08:10.897938 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d41068d-3439-4a1d-bb73-9d974c281d4c-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fzpfrb\" (UID: \"2d41068d-3439-4a1d-bb73-9d974c281d4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" Dec 05 06:08:10 crc kubenswrapper[4865]: E1205 06:08:10.898487 4865 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 06:08:10 crc kubenswrapper[4865]: E1205 06:08:10.898630 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d41068d-3439-4a1d-bb73-9d974c281d4c-cert podName:2d41068d-3439-4a1d-bb73-9d974c281d4c nodeName:}" failed. No retries permitted until 2025-12-05 06:08:14.898601798 +0000 UTC m=+914.178613020 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d41068d-3439-4a1d-bb73-9d974c281d4c-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fzpfrb" (UID: "2d41068d-3439-4a1d-bb73-9d974c281d4c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 06:08:11 crc kubenswrapper[4865]: I1205 06:08:11.050539 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:08:11 crc kubenswrapper[4865]: I1205 06:08:11.050623 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:08:11 crc kubenswrapper[4865]: E1205 06:08:11.707251 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-zzb4b" podUID="cf1398f2-aa09-45bb-9a98-5fadca999284" Dec 05 06:08:11 crc kubenswrapper[4865]: E1205 06:08:11.710117 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\", failed to 
\"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cppdn" podUID="1363659b-58f9-4f41-800c-863dd656d2b8" Dec 05 06:08:11 crc kubenswrapper[4865]: E1205 06:08:11.710129 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j5vmw" podUID="9b591a19-b272-4a03-8164-c0296161feb7" Dec 05 06:08:11 crc kubenswrapper[4865]: I1205 06:08:11.728949 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-metrics-certs\") pod \"openstack-operator-controller-manager-6f6696b64-hqh47\" (UID: \"0e3dd976-2c50-4721-a9a3-330c906f0e16\") " pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:11 crc kubenswrapper[4865]: E1205 06:08:11.731271 4865 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 06:08:11 crc kubenswrapper[4865]: E1205 06:08:11.734448 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-metrics-certs podName:0e3dd976-2c50-4721-a9a3-330c906f0e16 nodeName:}" failed. No retries permitted until 2025-12-05 06:08:15.734428237 +0000 UTC m=+915.014439459 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-metrics-certs") pod "openstack-operator-controller-manager-6f6696b64-hqh47" (UID: "0e3dd976-2c50-4721-a9a3-330c906f0e16") : secret "metrics-server-cert" not found Dec 05 06:08:11 crc kubenswrapper[4865]: I1205 06:08:11.836689 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-webhook-certs\") pod \"openstack-operator-controller-manager-6f6696b64-hqh47\" (UID: \"0e3dd976-2c50-4721-a9a3-330c906f0e16\") " pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:11 crc kubenswrapper[4865]: E1205 06:08:11.837060 4865 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 06:08:11 crc kubenswrapper[4865]: E1205 06:08:11.837146 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-webhook-certs podName:0e3dd976-2c50-4721-a9a3-330c906f0e16 nodeName:}" failed. No retries permitted until 2025-12-05 06:08:15.837123892 +0000 UTC m=+915.117135114 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-webhook-certs") pod "openstack-operator-controller-manager-6f6696b64-hqh47" (UID: "0e3dd976-2c50-4721-a9a3-330c906f0e16") : secret "webhook-server-cert" not found Dec 05 06:08:13 crc kubenswrapper[4865]: I1205 06:08:13.881643 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e13948be-6623-4815-af50-6e2b5ee807ba-cert\") pod \"infra-operator-controller-manager-758b7cbd9c-d2qcb\" (UID: \"e13948be-6623-4815-af50-6e2b5ee807ba\") " pod="openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb" Dec 05 06:08:13 crc kubenswrapper[4865]: E1205 06:08:13.881911 4865 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 06:08:13 crc kubenswrapper[4865]: E1205 06:08:13.881969 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e13948be-6623-4815-af50-6e2b5ee807ba-cert podName:e13948be-6623-4815-af50-6e2b5ee807ba nodeName:}" failed. No retries permitted until 2025-12-05 06:08:21.88195416 +0000 UTC m=+921.161965382 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e13948be-6623-4815-af50-6e2b5ee807ba-cert") pod "infra-operator-controller-manager-758b7cbd9c-d2qcb" (UID: "e13948be-6623-4815-af50-6e2b5ee807ba") : secret "infra-operator-webhook-server-cert" not found Dec 05 06:08:14 crc kubenswrapper[4865]: I1205 06:08:14.130082 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-djdgl" Dec 05 06:08:14 crc kubenswrapper[4865]: I1205 06:08:14.183354 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-djdgl"] Dec 05 06:08:14 crc kubenswrapper[4865]: I1205 06:08:14.730000 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-djdgl" podUID="75b2e75d-6906-4a48-953d-647cbc08256d" containerName="registry-server" containerID="cri-o://0821163367fe7b498def88423b203e09d69d7cd266cadcd5a9b86e196ad346d7" gracePeriod=2 Dec 05 06:08:14 crc kubenswrapper[4865]: I1205 06:08:14.999391 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d41068d-3439-4a1d-bb73-9d974c281d4c-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fzpfrb\" (UID: \"2d41068d-3439-4a1d-bb73-9d974c281d4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" Dec 05 06:08:15 crc kubenswrapper[4865]: E1205 06:08:14.999571 4865 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 06:08:15 crc kubenswrapper[4865]: E1205 06:08:14.999619 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d41068d-3439-4a1d-bb73-9d974c281d4c-cert podName:2d41068d-3439-4a1d-bb73-9d974c281d4c nodeName:}" failed. No retries permitted until 2025-12-05 06:08:22.999605919 +0000 UTC m=+922.279617141 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d41068d-3439-4a1d-bb73-9d974c281d4c-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fzpfrb" (UID: "2d41068d-3439-4a1d-bb73-9d974c281d4c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 06:08:15 crc kubenswrapper[4865]: I1205 06:08:15.744971 4865 generic.go:334] "Generic (PLEG): container finished" podID="75b2e75d-6906-4a48-953d-647cbc08256d" containerID="0821163367fe7b498def88423b203e09d69d7cd266cadcd5a9b86e196ad346d7" exitCode=0 Dec 05 06:08:15 crc kubenswrapper[4865]: I1205 06:08:15.745013 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djdgl" event={"ID":"75b2e75d-6906-4a48-953d-647cbc08256d","Type":"ContainerDied","Data":"0821163367fe7b498def88423b203e09d69d7cd266cadcd5a9b86e196ad346d7"} Dec 05 06:08:15 crc kubenswrapper[4865]: I1205 06:08:15.813027 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-metrics-certs\") pod \"openstack-operator-controller-manager-6f6696b64-hqh47\" (UID: \"0e3dd976-2c50-4721-a9a3-330c906f0e16\") " pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:15 crc kubenswrapper[4865]: E1205 06:08:15.813186 4865 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 06:08:15 crc kubenswrapper[4865]: E1205 06:08:15.813248 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-metrics-certs podName:0e3dd976-2c50-4721-a9a3-330c906f0e16 nodeName:}" failed. No retries permitted until 2025-12-05 06:08:23.813233516 +0000 UTC m=+923.093244738 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-metrics-certs") pod "openstack-operator-controller-manager-6f6696b64-hqh47" (UID: "0e3dd976-2c50-4721-a9a3-330c906f0e16") : secret "metrics-server-cert" not found Dec 05 06:08:15 crc kubenswrapper[4865]: I1205 06:08:15.914685 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-webhook-certs\") pod \"openstack-operator-controller-manager-6f6696b64-hqh47\" (UID: \"0e3dd976-2c50-4721-a9a3-330c906f0e16\") " pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:15 crc kubenswrapper[4865]: E1205 06:08:15.914929 4865 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 06:08:15 crc kubenswrapper[4865]: E1205 06:08:15.915032 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-webhook-certs podName:0e3dd976-2c50-4721-a9a3-330c906f0e16 nodeName:}" failed. No retries permitted until 2025-12-05 06:08:23.915007285 +0000 UTC m=+923.195018497 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-webhook-certs") pod "openstack-operator-controller-manager-6f6696b64-hqh47" (UID: "0e3dd976-2c50-4721-a9a3-330c906f0e16") : secret "webhook-server-cert" not found Dec 05 06:08:21 crc kubenswrapper[4865]: I1205 06:08:21.923598 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e13948be-6623-4815-af50-6e2b5ee807ba-cert\") pod \"infra-operator-controller-manager-758b7cbd9c-d2qcb\" (UID: \"e13948be-6623-4815-af50-6e2b5ee807ba\") " pod="openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb" Dec 05 06:08:21 crc kubenswrapper[4865]: I1205 06:08:21.932068 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e13948be-6623-4815-af50-6e2b5ee807ba-cert\") pod \"infra-operator-controller-manager-758b7cbd9c-d2qcb\" (UID: \"e13948be-6623-4815-af50-6e2b5ee807ba\") " pod="openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb" Dec 05 06:08:22 crc kubenswrapper[4865]: I1205 06:08:22.143250 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb" Dec 05 06:08:23 crc kubenswrapper[4865]: I1205 06:08:23.041458 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d41068d-3439-4a1d-bb73-9d974c281d4c-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fzpfrb\" (UID: \"2d41068d-3439-4a1d-bb73-9d974c281d4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" Dec 05 06:08:23 crc kubenswrapper[4865]: I1205 06:08:23.052387 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d41068d-3439-4a1d-bb73-9d974c281d4c-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fzpfrb\" (UID: \"2d41068d-3439-4a1d-bb73-9d974c281d4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" Dec 05 06:08:23 crc kubenswrapper[4865]: E1205 06:08:23.112456 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad" Dec 05 06:08:23 crc kubenswrapper[4865]: E1205 06:08:23.113067 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zfzqm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-79c8c4686c-cv8vc_openstack-operators(1bad98dd-eca3-4f98-884a-655e104b2d92): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:08:23 crc kubenswrapper[4865]: I1205 06:08:23.341103 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" Dec 05 06:08:23 crc kubenswrapper[4865]: E1205 06:08:23.697576 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 05 06:08:23 crc kubenswrapper[4865]: E1205 06:08:23.697841 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6zmf4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-2jh72_openstack-operators(51ef47f4-9d56-4555-9a53-007c8648651a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:08:23 crc kubenswrapper[4865]: I1205 06:08:23.854399 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-metrics-certs\") pod \"openstack-operator-controller-manager-6f6696b64-hqh47\" (UID: \"0e3dd976-2c50-4721-a9a3-330c906f0e16\") " pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:23 crc kubenswrapper[4865]: I1205 06:08:23.874137 4865 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-metrics-certs\") pod \"openstack-operator-controller-manager-6f6696b64-hqh47\" (UID: \"0e3dd976-2c50-4721-a9a3-330c906f0e16\") " pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:23 crc kubenswrapper[4865]: I1205 06:08:23.956461 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-webhook-certs\") pod \"openstack-operator-controller-manager-6f6696b64-hqh47\" (UID: \"0e3dd976-2c50-4721-a9a3-330c906f0e16\") " pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:23 crc kubenswrapper[4865]: I1205 06:08:23.963004 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0e3dd976-2c50-4721-a9a3-330c906f0e16-webhook-certs\") pod \"openstack-operator-controller-manager-6f6696b64-hqh47\" (UID: \"0e3dd976-2c50-4721-a9a3-330c906f0e16\") " pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:24 crc kubenswrapper[4865]: I1205 06:08:24.069370 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:24 crc kubenswrapper[4865]: E1205 06:08:24.079659 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0821163367fe7b498def88423b203e09d69d7cd266cadcd5a9b86e196ad346d7 is running failed: container process not found" containerID="0821163367fe7b498def88423b203e09d69d7cd266cadcd5a9b86e196ad346d7" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 06:08:24 crc kubenswrapper[4865]: E1205 06:08:24.084994 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0821163367fe7b498def88423b203e09d69d7cd266cadcd5a9b86e196ad346d7 is running failed: container process not found" containerID="0821163367fe7b498def88423b203e09d69d7cd266cadcd5a9b86e196ad346d7" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 06:08:24 crc kubenswrapper[4865]: E1205 06:08:24.085494 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0821163367fe7b498def88423b203e09d69d7cd266cadcd5a9b86e196ad346d7 is running failed: container process not found" containerID="0821163367fe7b498def88423b203e09d69d7cd266cadcd5a9b86e196ad346d7" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 06:08:24 crc kubenswrapper[4865]: E1205 06:08:24.085568 4865 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0821163367fe7b498def88423b203e09d69d7cd266cadcd5a9b86e196ad346d7 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-djdgl" podUID="75b2e75d-6906-4a48-953d-647cbc08256d" containerName="registry-server" Dec 05 06:08:29 crc kubenswrapper[4865]: E1205 06:08:29.151668 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 05 06:08:29 crc kubenswrapper[4865]: E1205 06:08:29.152797 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x6fzr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-8zkdr_openstack-operators(db94fe25-0c93-4471-852d-45b20c0f266c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:08:29 crc kubenswrapper[4865]: E1205 06:08:29.631502 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87" Dec 05 06:08:29 crc kubenswrapper[4865]: E1205 06:08:29.631783 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-knw2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-967d97867-r8f45_openstack-operators(8d67bcae-4ae9-4545-8410-236efec0cc30): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:08:30 crc kubenswrapper[4865]: E1205 06:08:30.325941 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 05 06:08:30 crc kubenswrapper[4865]: E1205 06:08:30.326140 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7bm7d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-4546x_openstack-operators(571eed7b-c231-42db-8acd-8f2efc828947): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:08:34 crc kubenswrapper[4865]: E1205 06:08:34.079966 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0821163367fe7b498def88423b203e09d69d7cd266cadcd5a9b86e196ad346d7 is running failed: container process not found" containerID="0821163367fe7b498def88423b203e09d69d7cd266cadcd5a9b86e196ad346d7" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 06:08:34 crc kubenswrapper[4865]: E1205 06:08:34.080954 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0821163367fe7b498def88423b203e09d69d7cd266cadcd5a9b86e196ad346d7 is running failed: container process not found" containerID="0821163367fe7b498def88423b203e09d69d7cd266cadcd5a9b86e196ad346d7" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 06:08:34 crc kubenswrapper[4865]: E1205 06:08:34.081596 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0821163367fe7b498def88423b203e09d69d7cd266cadcd5a9b86e196ad346d7 is running failed: container process not found" containerID="0821163367fe7b498def88423b203e09d69d7cd266cadcd5a9b86e196ad346d7" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 06:08:34 crc kubenswrapper[4865]: E1205 06:08:34.081718 4865 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0821163367fe7b498def88423b203e09d69d7cd266cadcd5a9b86e196ad346d7 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-djdgl" podUID="75b2e75d-6906-4a48-953d-647cbc08256d" containerName="registry-server" Dec 05 06:08:36 crc kubenswrapper[4865]: E1205 
06:08:36.732069 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 05 06:08:36 crc kubenswrapper[4865]: E1205 06:08:36.732474 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gpdk5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-bg55s_openstack-operators(c21265ee-9968-411a-9387-f0c3920b3883): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:08:36 crc kubenswrapper[4865]: E1205 06:08:36.734898 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bg55s" podUID="c21265ee-9968-411a-9387-f0c3920b3883" Dec 05 06:08:36 crc kubenswrapper[4865]: E1205 06:08:36.924629 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bg55s" podUID="c21265ee-9968-411a-9387-f0c3920b3883" Dec 05 06:08:37 crc kubenswrapper[4865]: E1205 06:08:37.317316 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context 
canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 05 06:08:37 crc kubenswrapper[4865]: E1205 06:08:37.317802 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5pmlg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-v25kd_openstack-operators(2364f477-be51-4698-914a-94d0fd2dd983): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:08:39 crc kubenswrapper[4865]: E1205 06:08:39.257366 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:981b6a8f95934a86c5f10ef6e198b07265aeba7f11cf84b9ccd13dfaf06f3ca3" Dec 05 06:08:39 crc kubenswrapper[4865]: E1205 06:08:39.257630 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:981b6a8f95934a86c5f10ef6e198b07265aeba7f11cf84b9ccd13dfaf06f3ca3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-whlzh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-6c677c69b-97l79_openstack-operators(f4fc5327-1468-48aa-9a51-e8be8bfb5629): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:08:39 crc kubenswrapper[4865]: E1205 06:08:39.804641 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a" Dec 05 06:08:39 crc kubenswrapper[4865]: E1205 06:08:39.804817 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xcw8n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-697fb699cf-j6st7_openstack-operators(30f6dc0d-1962-42c0-a128-d7a54943d849): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:08:41 crc kubenswrapper[4865]: I1205 06:08:41.049295 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:08:41 crc kubenswrapper[4865]: I1205 06:08:41.049666 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:08:41 crc kubenswrapper[4865]: I1205 06:08:41.049721 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 06:08:41 crc kubenswrapper[4865]: I1205 06:08:41.050317 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d163070852eac0c87032f69fbdb534afbbd8e4f78e69ec919b3b74b72f841eab"} pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 06:08:41 crc kubenswrapper[4865]: I1205 06:08:41.050365 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" containerID="cri-o://d163070852eac0c87032f69fbdb534afbbd8e4f78e69ec919b3b74b72f841eab" gracePeriod=600 Dec 05 06:08:42 crc kubenswrapper[4865]: E1205 06:08:42.452378 4865 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 05 06:08:42 crc kubenswrapper[4865]: E1205 06:08:42.452638 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-plr44,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-kkstd_openstack-operators(87bce1fb-16c2-4c47-aa02-3f94aa681b58): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:08:42 crc kubenswrapper[4865]: E1205 06:08:42.897405 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991" Dec 05 06:08:42 crc kubenswrapper[4865]: E1205 06:08:42.897662 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g7q2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9d58d64bc-qphvq_openstack-operators(0445a96f-f840-45c4-a1c3-f4455c49b216): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:08:42 crc kubenswrapper[4865]: I1205 06:08:42.971301 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="d163070852eac0c87032f69fbdb534afbbd8e4f78e69ec919b3b74b72f841eab" exitCode=0 Dec 05 06:08:42 crc kubenswrapper[4865]: I1205 06:08:42.971350 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"d163070852eac0c87032f69fbdb534afbbd8e4f78e69ec919b3b74b72f841eab"} Dec 05 06:08:42 crc kubenswrapper[4865]: I1205 06:08:42.971396 4865 scope.go:117] "RemoveContainer" containerID="80e4ab6bf8f2776e8ac270a6781a82ac7a7696de67acef027bfa81b854301141" Dec 05 06:08:43 crc kubenswrapper[4865]: E1205 06:08:43.400497 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 05 06:08:43 crc kubenswrapper[4865]: E1205 06:08:43.400963 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-52xz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-7nqrp_openstack-operators(8e1c4c0e-047b-4727-9435-7192e4f48bea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:08:43 crc kubenswrapper[4865]: I1205 06:08:43.431764 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-djdgl" Dec 05 06:08:43 crc kubenswrapper[4865]: I1205 06:08:43.481764 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75b2e75d-6906-4a48-953d-647cbc08256d-catalog-content\") pod \"75b2e75d-6906-4a48-953d-647cbc08256d\" (UID: \"75b2e75d-6906-4a48-953d-647cbc08256d\") " Dec 05 06:08:43 crc kubenswrapper[4865]: I1205 06:08:43.481866 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75b2e75d-6906-4a48-953d-647cbc08256d-utilities\") pod \"75b2e75d-6906-4a48-953d-647cbc08256d\" (UID: \"75b2e75d-6906-4a48-953d-647cbc08256d\") " Dec 05 06:08:43 crc kubenswrapper[4865]: I1205 06:08:43.481895 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k9z2\" (UniqueName: \"kubernetes.io/projected/75b2e75d-6906-4a48-953d-647cbc08256d-kube-api-access-2k9z2\") pod \"75b2e75d-6906-4a48-953d-647cbc08256d\" (UID: \"75b2e75d-6906-4a48-953d-647cbc08256d\") " Dec 05 06:08:43 crc kubenswrapper[4865]: I1205 06:08:43.482418 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75b2e75d-6906-4a48-953d-647cbc08256d-utilities" (OuterVolumeSpecName: "utilities") pod "75b2e75d-6906-4a48-953d-647cbc08256d" (UID: "75b2e75d-6906-4a48-953d-647cbc08256d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:08:43 crc kubenswrapper[4865]: I1205 06:08:43.492577 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75b2e75d-6906-4a48-953d-647cbc08256d-kube-api-access-2k9z2" (OuterVolumeSpecName: "kube-api-access-2k9z2") pod "75b2e75d-6906-4a48-953d-647cbc08256d" (UID: "75b2e75d-6906-4a48-953d-647cbc08256d"). InnerVolumeSpecName "kube-api-access-2k9z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:08:43 crc kubenswrapper[4865]: I1205 06:08:43.539606 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75b2e75d-6906-4a48-953d-647cbc08256d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75b2e75d-6906-4a48-953d-647cbc08256d" (UID: "75b2e75d-6906-4a48-953d-647cbc08256d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:08:43 crc kubenswrapper[4865]: I1205 06:08:43.584099 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75b2e75d-6906-4a48-953d-647cbc08256d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:08:43 crc kubenswrapper[4865]: I1205 06:08:43.584146 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75b2e75d-6906-4a48-953d-647cbc08256d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:08:43 crc kubenswrapper[4865]: I1205 06:08:43.584162 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k9z2\" (UniqueName: \"kubernetes.io/projected/75b2e75d-6906-4a48-953d-647cbc08256d-kube-api-access-2k9z2\") on node \"crc\" DevicePath \"\"" Dec 05 06:08:43 crc kubenswrapper[4865]: I1205 06:08:43.996569 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djdgl" event={"ID":"75b2e75d-6906-4a48-953d-647cbc08256d","Type":"ContainerDied","Data":"555c3f34baca8f16f87497efc6c143ab25583791d8e2aa9df0236b72897422f6"} Dec 05 06:08:43 crc kubenswrapper[4865]: I1205 06:08:43.996719 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djdgl" Dec 05 06:08:44 crc kubenswrapper[4865]: I1205 06:08:44.047177 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-djdgl"] Dec 05 06:08:44 crc kubenswrapper[4865]: I1205 06:08:44.053964 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-djdgl"] Dec 05 06:08:44 crc kubenswrapper[4865]: E1205 06:08:44.060906 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 05 06:08:44 crc kubenswrapper[4865]: E1205 06:08:44.061131 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jbh97,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-mlcgh_openstack-operators(c3d9f2e6-7658-4f43-8d62-72bd4305c06a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:08:45 crc kubenswrapper[4865]: I1205 06:08:45.021566 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75b2e75d-6906-4a48-953d-647cbc08256d" path="/var/lib/kubelet/pods/75b2e75d-6906-4a48-953d-647cbc08256d/volumes" Dec 05 06:08:45 crc kubenswrapper[4865]: E1205 06:08:45.312272 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8" Dec 05 06:08:45 crc kubenswrapper[4865]: E1205 06:08:45.312898 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zlhz7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-667bd8d554-zzb4b_openstack-operators(cf1398f2-aa09-45bb-9a98-5fadca999284): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:08:46 crc kubenswrapper[4865]: E1205 06:08:46.006563 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f" Dec 05 06:08:46 crc kubenswrapper[4865]: E1205 06:08:46.006736 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nr4ls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-cppdn_openstack-operators(1363659b-58f9-4f41-800c-863dd656d2b8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:08:46 crc kubenswrapper[4865]: E1205 06:08:46.505268 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 05 06:08:46 crc kubenswrapper[4865]: E1205 06:08:46.505502 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2nsd5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-j5vmw_openstack-operators(9b591a19-b272-4a03-8164-c0296161feb7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:08:47 crc kubenswrapper[4865]: I1205 06:08:47.103647 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb"] Dec 05 06:08:47 crc kubenswrapper[4865]: I1205 06:08:47.170085 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb"] Dec 05 06:08:47 crc kubenswrapper[4865]: I1205 06:08:47.175636 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47"] Dec 05 06:08:47 crc kubenswrapper[4865]: I1205 06:08:47.812359 4865 scope.go:117] "RemoveContainer" containerID="0821163367fe7b498def88423b203e09d69d7cd266cadcd5a9b86e196ad346d7" Dec 05 06:08:47 crc kubenswrapper[4865]: W1205 06:08:47.849604 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d41068d_3439_4a1d_bb73_9d974c281d4c.slice/crio-7f719a5ba392873733cf53753704d4130b8f5a2430b1992d8ae5fb24539f9a05 WatchSource:0}: Error finding container 7f719a5ba392873733cf53753704d4130b8f5a2430b1992d8ae5fb24539f9a05: Status 404 returned error can't find the container with id 7f719a5ba392873733cf53753704d4130b8f5a2430b1992d8ae5fb24539f9a05 Dec 05 06:08:47 crc kubenswrapper[4865]: I1205 06:08:47.968259 4865 scope.go:117] "RemoveContainer" containerID="3171f15144a8f30e68d77a4596e050f2feb48e291d960355e9f55537724a43d8" Dec 05 06:08:48 crc kubenswrapper[4865]: I1205 06:08:48.027346 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" event={"ID":"0e3dd976-2c50-4721-a9a3-330c906f0e16","Type":"ContainerStarted","Data":"2a7327b85436509079af15568f4a096ba3227934226bbd257ee4ace6edcc9d04"} Dec 05 06:08:48 crc kubenswrapper[4865]: I1205 06:08:48.034597 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" event={"ID":"2d41068d-3439-4a1d-bb73-9d974c281d4c","Type":"ContainerStarted","Data":"7f719a5ba392873733cf53753704d4130b8f5a2430b1992d8ae5fb24539f9a05"} Dec 05 06:08:48 crc kubenswrapper[4865]: I1205 06:08:48.037336 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb" 
event={"ID":"e13948be-6623-4815-af50-6e2b5ee807ba","Type":"ContainerStarted","Data":"ce3e0932c6920a94263ea7a7277e40ea6dd57e28508317d2561edbed3424e815"} Dec 05 06:08:48 crc kubenswrapper[4865]: I1205 06:08:48.216531 4865 scope.go:117] "RemoveContainer" containerID="9a52c18dba41bf3eef5a55c9e8ba9a08e70f201c658d62ffd552c48c1862c0f8" Dec 05 06:08:49 crc kubenswrapper[4865]: I1205 06:08:49.059735 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-cdf4c" event={"ID":"1caf6bc1-a2e2-4330-bc4f-1f324ec5de84","Type":"ContainerStarted","Data":"36002b5b2e50f9ed7ac67dbd5e4c3db03cd823623d188793e26eb5151862d6a5"} Dec 05 06:08:49 crc kubenswrapper[4865]: I1205 06:08:49.069211 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7jl8d" event={"ID":"59231c2f-740e-4c04-af17-53dab82b3497","Type":"ContainerStarted","Data":"9963730da90d59936b26b9282c1f138227192406fb52856b8f36dabbbdcc7a1d"} Dec 05 06:08:49 crc kubenswrapper[4865]: I1205 06:08:49.071446 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" event={"ID":"0e3dd976-2c50-4721-a9a3-330c906f0e16","Type":"ContainerStarted","Data":"69e4ead2cb738ea8b86e82901bf6ca3e507dc090344516864e561a98078a1142"} Dec 05 06:08:49 crc kubenswrapper[4865]: I1205 06:08:49.072408 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:49 crc kubenswrapper[4865]: I1205 06:08:49.079450 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"20536395e22903e5ca8dee5d63c34f131b2da1d0f9f86ec93b930a0c9e072342"} Dec 05 06:08:49 crc kubenswrapper[4865]: I1205 06:08:49.083807 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wrx8" event={"ID":"a44f8567-c35d-4bf4-be5c-ffbde539bb3a","Type":"ContainerStarted","Data":"f4aed4c3ee6a80701b7039f6c4b9ee7c71ed7c1b3d36ceafddb2c5c5a6e935e7"} Dec 05 06:08:49 crc kubenswrapper[4865]: I1205 06:08:49.097388 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" podStartSLOduration=42.097365067 podStartE2EDuration="42.097365067s" podCreationTimestamp="2025-12-05 06:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:08:49.095950518 +0000 UTC m=+948.375961740" watchObservedRunningTime="2025-12-05 06:08:49.097365067 +0000 UTC m=+948.377376309" Dec 05 06:08:52 crc kubenswrapper[4865]: I1205 06:08:52.128077 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-vjgsh" event={"ID":"60a54835-3802-4f32-be4f-ea7ace9084f6","Type":"ContainerStarted","Data":"387b1a55cf9efb7e5567e5536aabe8b1db310d4cabf429a8d208e2afe4c90389"} Dec 05 06:08:54 crc kubenswrapper[4865]: I1205 06:08:54.087921 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6f6696b64-hqh47" Dec 05 06:08:54 crc kubenswrapper[4865]: I1205 06:08:54.183906 4865 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-vjgsh" event={"ID":"60a54835-3802-4f32-be4f-ea7ace9084f6","Type":"ContainerStarted","Data":"d62f5c864c2383bfd4a108679891c3f2dfe840857c659ac0fea86fb8fc7efe3c"} Dec 05 06:08:54 crc kubenswrapper[4865]: I1205 06:08:54.184223 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-vjgsh" Dec 05 06:08:54 crc kubenswrapper[4865]: I1205 06:08:54.225443 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-vjgsh" podStartSLOduration=4.155442807 podStartE2EDuration="48.225427124s" podCreationTimestamp="2025-12-05 06:08:06 +0000 UTC" firstStartedPulling="2025-12-05 06:08:09.256211037 +0000 UTC m=+908.536222259" lastFinishedPulling="2025-12-05 06:08:53.326195324 +0000 UTC m=+952.606206576" observedRunningTime="2025-12-05 06:08:54.217356241 +0000 UTC m=+953.497367453" watchObservedRunningTime="2025-12-05 06:08:54.225427124 +0000 UTC m=+953.505438346" Dec 05 06:08:54 crc kubenswrapper[4865]: E1205 06:08:54.368970 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-967d97867-r8f45" podUID="8d67bcae-4ae9-4545-8410-236efec0cc30" Dec 05 06:08:54 crc kubenswrapper[4865]: E1205 06:08:54.421548 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mlcgh" podUID="c3d9f2e6-7658-4f43-8d62-72bd4305c06a" Dec 05 06:08:54 crc kubenswrapper[4865]: E1205 06:08:54.471485 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-qphvq" podUID="0445a96f-f840-45c4-a1c3-f4455c49b216" Dec 05 06:08:54 crc kubenswrapper[4865]: E1205 06:08:54.516876 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2jh72" podUID="51ef47f4-9d56-4555-9a53-007c8648651a" Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.195746 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-r8f45" event={"ID":"8d67bcae-4ae9-4545-8410-236efec0cc30","Type":"ContainerStarted","Data":"2f56f6a1d6d0121a664a09e13bc5c026051445c5090604b1a673dafc9c87a43e"} Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.201336 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb" event={"ID":"e13948be-6623-4815-af50-6e2b5ee807ba","Type":"ContainerStarted","Data":"8e1a30dae300136f027322586353a4004cb6f9236933b4f7ea9ba0ccb97a2b9e"} Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.201371 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb" 
event={"ID":"e13948be-6623-4815-af50-6e2b5ee807ba","Type":"ContainerStarted","Data":"9b76356e9465786274ed9225b46b98bc4cb61867e0492962e8f54a7db343abbf"} Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.201491 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb" Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.202987 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-cdf4c" event={"ID":"1caf6bc1-a2e2-4330-bc4f-1f324ec5de84","Type":"ContainerStarted","Data":"10b31c5ab68dd67b5aa06bb75f5beef6195742c47bf97bacac2f4bcd9a7e1d2b"} Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.203207 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-cdf4c" Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.204997 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-cdf4c" Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.206613 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-zzb4b" event={"ID":"cf1398f2-aa09-45bb-9a98-5fadca999284","Type":"ContainerStarted","Data":"ebd957c140e2b67e78638aab546df7a787724fe930c4a4f65ee85533088ebcac"} Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.207561 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-qphvq" event={"ID":"0445a96f-f840-45c4-a1c3-f4455c49b216","Type":"ContainerStarted","Data":"f734f14fcfb209c8f1d16e2b9a94b50ff4fb3f291d63d3d50b7b3ac9fbf61b1d"} Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.212212 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bg55s" event={"ID":"c21265ee-9968-411a-9387-f0c3920b3883","Type":"ContainerStarted","Data":"7010b52ff382b2c56d34f8772509c32b3916846c4260e1781ae7caff26297ab2"} Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.216085 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7jl8d" event={"ID":"59231c2f-740e-4c04-af17-53dab82b3497","Type":"ContainerStarted","Data":"6e7197ce221c0ffc059ce5721a20bd13711ee4fb2693b1f4777eddfae7150be1"} Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.216703 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7jl8d" Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.217664 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mlcgh" event={"ID":"c3d9f2e6-7658-4f43-8d62-72bd4305c06a","Type":"ContainerStarted","Data":"61d149c9e35cb0b51fa6c1514c30e12dc04f735a7c047f67ec6ddeab9c81ed80"} Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.218380 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7jl8d" Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.220490 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2jh72" 
event={"ID":"51ef47f4-9d56-4555-9a53-007c8648651a","Type":"ContainerStarted","Data":"42b7fee35173dd319c035c0389bded76a3c3f936142b3cc7688b41733d8d4370"} Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.243800 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-cdf4c" podStartSLOduration=4.65373179 podStartE2EDuration="49.243778772s" podCreationTimestamp="2025-12-05 06:08:06 +0000 UTC" firstStartedPulling="2025-12-05 06:08:09.390520774 +0000 UTC m=+908.670531996" lastFinishedPulling="2025-12-05 06:08:53.980567756 +0000 UTC m=+953.260578978" observedRunningTime="2025-12-05 06:08:55.239525514 +0000 UTC m=+954.519536736" watchObservedRunningTime="2025-12-05 06:08:55.243778772 +0000 UTC m=+954.523789994" Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.244728 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wrx8" event={"ID":"a44f8567-c35d-4bf4-be5c-ffbde539bb3a","Type":"ContainerStarted","Data":"e57e4823de029a2625c90aac5395a1dddc7637987954ce688cdb9c32502edfee"} Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.244777 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wrx8" Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.248306 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wrx8" Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.300314 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7jl8d" podStartSLOduration=4.762765029 podStartE2EDuration="50.300293592s" podCreationTimestamp="2025-12-05 06:08:05 +0000 UTC" firstStartedPulling="2025-12-05 06:08:07.818233247 +0000 UTC m=+907.098244469" lastFinishedPulling="2025-12-05 06:08:53.3557618 +0000 UTC m=+952.635773032" observedRunningTime="2025-12-05 06:08:55.289801752 +0000 UTC m=+954.569812974" watchObservedRunningTime="2025-12-05 06:08:55.300293592 +0000 UTC m=+954.580304814" Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.319982 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bg55s" podStartSLOduration=4.642048058 podStartE2EDuration="48.319955184s" podCreationTimestamp="2025-12-05 06:08:07 +0000 UTC" firstStartedPulling="2025-12-05 06:08:09.681769222 +0000 UTC m=+908.961780444" lastFinishedPulling="2025-12-05 06:08:53.359676338 +0000 UTC m=+952.639687570" observedRunningTime="2025-12-05 06:08:55.306419991 +0000 UTC m=+954.586431213" watchObservedRunningTime="2025-12-05 06:08:55.319955184 +0000 UTC m=+954.599966416" Dec 05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.371483 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb" podStartSLOduration=44.205463931 podStartE2EDuration="50.371463416s" podCreationTimestamp="2025-12-05 06:08:05 +0000 UTC" firstStartedPulling="2025-12-05 06:08:47.844650381 +0000 UTC m=+947.124661603" lastFinishedPulling="2025-12-05 06:08:54.010649866 +0000 UTC m=+953.290661088" observedRunningTime="2025-12-05 06:08:55.369045559 +0000 UTC m=+954.649056781" watchObservedRunningTime="2025-12-05 06:08:55.371463416 +0000 UTC m=+954.651474638" Dec 
05 06:08:55 crc kubenswrapper[4865]: I1205 06:08:55.414823 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wrx8" podStartSLOduration=5.294816534 podStartE2EDuration="50.414800092s" podCreationTimestamp="2025-12-05 06:08:05 +0000 UTC" firstStartedPulling="2025-12-05 06:08:08.236013539 +0000 UTC m=+907.516024761" lastFinishedPulling="2025-12-05 06:08:53.355997097 +0000 UTC m=+952.636008319" observedRunningTime="2025-12-05 06:08:55.407949213 +0000 UTC m=+954.687960455" watchObservedRunningTime="2025-12-05 06:08:55.414800092 +0000 UTC m=+954.694811314" Dec 05 06:08:55 crc kubenswrapper[4865]: E1205 06:08:55.816095 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-zzb4b" podUID="cf1398f2-aa09-45bb-9a98-5fadca999284" Dec 05 06:08:55 crc kubenswrapper[4865]: E1205 06:08:55.838099 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j5vmw" podUID="9b591a19-b272-4a03-8164-c0296161feb7" Dec 05 06:08:55 crc kubenswrapper[4865]: E1205 06:08:55.840030 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-cv8vc" podUID="1bad98dd-eca3-4f98-884a-655e104b2d92" Dec 05 06:08:55 crc kubenswrapper[4865]: E1205 06:08:55.854043 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kkstd" podUID="87bce1fb-16c2-4c47-aa02-3f94aa681b58" Dec 05 06:08:56 crc kubenswrapper[4865]: E1205 06:08:56.162760 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7nqrp" podUID="8e1c4c0e-047b-4727-9435-7192e4f48bea" Dec 05 06:08:56 crc kubenswrapper[4865]: I1205 06:08:56.284713 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7nqrp" event={"ID":"8e1c4c0e-047b-4727-9435-7192e4f48bea","Type":"ContainerStarted","Data":"9e65c89595edf0cc5d4c93fabcfa6e473b1607d84674b10b144a7be29d706ab7"} Dec 05 06:08:56 crc kubenswrapper[4865]: I1205 06:08:56.287290 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kkstd" event={"ID":"87bce1fb-16c2-4c47-aa02-3f94aa681b58","Type":"ContainerStarted","Data":"8a90749a1091bcb2774f051628e029906c605351f71b109a2aeed39663b63a6f"} Dec 05 06:08:56 crc kubenswrapper[4865]: I1205 06:08:56.293621 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-cv8vc" 
event={"ID":"1bad98dd-eca3-4f98-884a-655e104b2d92","Type":"ContainerStarted","Data":"4c1c2f0ae11ec58b028443cea8a70eec176349490d2d36b50d1e80db5326ad69"} Dec 05 06:08:56 crc kubenswrapper[4865]: I1205 06:08:56.296065 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j5vmw" event={"ID":"9b591a19-b272-4a03-8164-c0296161feb7","Type":"ContainerStarted","Data":"37ed7792a8208575abb43388a857b62b4a328220efa9027bb0b87a3fcc3408bc"} Dec 05 06:08:56 crc kubenswrapper[4865]: E1205 06:08:56.305193 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j5vmw" podUID="9b591a19-b272-4a03-8164-c0296161feb7" Dec 05 06:08:56 crc kubenswrapper[4865]: E1205 06:08:56.306079 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-zzb4b" podUID="cf1398f2-aa09-45bb-9a98-5fadca999284" Dec 05 06:08:56 crc kubenswrapper[4865]: E1205 06:08:56.427209 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-j6st7" podUID="30f6dc0d-1962-42c0-a128-d7a54943d849" Dec 05 06:08:56 crc kubenswrapper[4865]: E1205 06:08:56.542952 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4546x" podUID="571eed7b-c231-42db-8acd-8f2efc828947" Dec 05 06:08:56 crc kubenswrapper[4865]: E1205 06:08:56.727267 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v25kd" podUID="2364f477-be51-4698-914a-94d0fd2dd983" Dec 05 06:08:56 crc kubenswrapper[4865]: E1205 06:08:56.778731 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8zkdr" podUID="db94fe25-0c93-4471-852d-45b20c0f266c" Dec 05 06:08:56 crc kubenswrapper[4865]: E1205 06:08:56.829319 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-97l79" podUID="f4fc5327-1468-48aa-9a51-e8be8bfb5629" Dec 05 06:08:56 crc kubenswrapper[4865]: E1205 06:08:56.944381 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: 
\"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cppdn" podUID="1363659b-58f9-4f41-800c-863dd656d2b8" Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.158094 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-vjgsh" Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.308855 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v25kd" event={"ID":"2364f477-be51-4698-914a-94d0fd2dd983","Type":"ContainerStarted","Data":"a35c41a90c939d3fa4c4006edc303401b2597055ddc1400e76f666d6e314661b"} Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.317892 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-qphvq" event={"ID":"0445a96f-f840-45c4-a1c3-f4455c49b216","Type":"ContainerStarted","Data":"abcbfa3886065e0a666f83714c99c475d3d1ee1ddce622d1f224f54c6fae0cfd"} Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.318017 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-qphvq" Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.328268 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-j6st7" event={"ID":"30f6dc0d-1962-42c0-a128-d7a54943d849","Type":"ContainerStarted","Data":"5922fd8950ab15ac015e75a1356dad5d4e0bc915d134e7a9588a954918421db8"} Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.338343 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cppdn" event={"ID":"1363659b-58f9-4f41-800c-863dd656d2b8","Type":"ContainerStarted","Data":"21538e2b1a824959dc46a90cc2a92fa9ace97c458617a5d08c0585e6a8e1ad08"} Dec 05 06:08:57 crc kubenswrapper[4865]: E1205 06:08:57.342447 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cppdn" podUID="1363659b-58f9-4f41-800c-863dd656d2b8" Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.348924 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4546x" event={"ID":"571eed7b-c231-42db-8acd-8f2efc828947","Type":"ContainerStarted","Data":"672b775d8d19f63ad147070c4c8777e6dffe8b8b2c085344cb4c331202191750"} Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.357333 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-97l79" event={"ID":"f4fc5327-1468-48aa-9a51-e8be8bfb5629","Type":"ContainerStarted","Data":"755222acd514366135eb0ed2898f4da8f7fa4ac06c18c79e17b94c3b11b86a5b"} Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.366926 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-qphvq" podStartSLOduration=4.994826234 podStartE2EDuration="51.366907632s" podCreationTimestamp="2025-12-05 06:08:06 +0000 UTC" 
firstStartedPulling="2025-12-05 06:08:09.708925202 +0000 UTC m=+908.988936424" lastFinishedPulling="2025-12-05 06:08:56.0810066 +0000 UTC m=+955.361017822" observedRunningTime="2025-12-05 06:08:57.363974641 +0000 UTC m=+956.643985873" watchObservedRunningTime="2025-12-05 06:08:57.366907632 +0000 UTC m=+956.646918854" Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.369871 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mlcgh" event={"ID":"c3d9f2e6-7658-4f43-8d62-72bd4305c06a","Type":"ContainerStarted","Data":"0f91539313fad9522a5c8e0eb20cdb632554114fb1f8715181473e919bbdb197"} Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.372411 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-cv8vc" event={"ID":"1bad98dd-eca3-4f98-884a-655e104b2d92","Type":"ContainerStarted","Data":"13a1a67874e799d549123144eb909f222d499dc4a1db96c8f71ca496d01ed8d0"} Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.385031 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-cv8vc" Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.391296 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2jh72" event={"ID":"51ef47f4-9d56-4555-9a53-007c8648651a","Type":"ContainerStarted","Data":"fe563f5ed6469c6361130fa4f060e4b3fadee98d6a8b0c8c13ceeabe7570e519"} Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.391429 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2jh72" Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.396537 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-r8f45" event={"ID":"8d67bcae-4ae9-4545-8410-236efec0cc30","Type":"ContainerStarted","Data":"8ebca25c4dee8c45ead16596a0e94e37bbaf0a3e9d1e77c53895f914e9ae049e"} Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.397320 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-967d97867-r8f45" Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.412659 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kkstd" event={"ID":"87bce1fb-16c2-4c47-aa02-3f94aa681b58","Type":"ContainerStarted","Data":"089b21b02e7574d859f62fc48bc9ada1f0887d91392bd657d476c39f9efe2de6"} Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.414043 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kkstd" Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.431979 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8zkdr" event={"ID":"db94fe25-0c93-4471-852d-45b20c0f266c","Type":"ContainerStarted","Data":"ad413bd3446b2a8f8e019cff1e84b2975e0c1f4474d4b31f19745fcc877a9676"} Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.495712 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mlcgh" podStartSLOduration=5.53271255 podStartE2EDuration="52.495692076s" podCreationTimestamp="2025-12-05 
06:08:05 +0000 UTC" firstStartedPulling="2025-12-05 06:08:09.309092627 +0000 UTC m=+908.589103849" lastFinishedPulling="2025-12-05 06:08:56.272072153 +0000 UTC m=+955.552083375" observedRunningTime="2025-12-05 06:08:57.469248086 +0000 UTC m=+956.749259308" watchObservedRunningTime="2025-12-05 06:08:57.495692076 +0000 UTC m=+956.775703298" Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.560039 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kkstd" podStartSLOduration=4.639704663 podStartE2EDuration="52.560020072s" podCreationTimestamp="2025-12-05 06:08:05 +0000 UTC" firstStartedPulling="2025-12-05 06:08:08.912758898 +0000 UTC m=+908.192770120" lastFinishedPulling="2025-12-05 06:08:56.833074307 +0000 UTC m=+956.113085529" observedRunningTime="2025-12-05 06:08:57.52479676 +0000 UTC m=+956.804807982" watchObservedRunningTime="2025-12-05 06:08:57.560020072 +0000 UTC m=+956.840031294" Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.566706 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-cv8vc" podStartSLOduration=4.128077742 podStartE2EDuration="51.566692166s" podCreationTimestamp="2025-12-05 06:08:06 +0000 UTC" firstStartedPulling="2025-12-05 06:08:09.309707974 +0000 UTC m=+908.589719196" lastFinishedPulling="2025-12-05 06:08:56.748322398 +0000 UTC m=+956.028333620" observedRunningTime="2025-12-05 06:08:57.562024767 +0000 UTC m=+956.842035989" watchObservedRunningTime="2025-12-05 06:08:57.566692166 +0000 UTC m=+956.846703388" Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.656631 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2jh72" podStartSLOduration=5.073753473 podStartE2EDuration="51.656609308s" podCreationTimestamp="2025-12-05 06:08:06 +0000 UTC" firstStartedPulling="2025-12-05 06:08:09.398788423 +0000 UTC m=+908.678799645" lastFinishedPulling="2025-12-05 06:08:55.981644258 +0000 UTC m=+955.261655480" observedRunningTime="2025-12-05 06:08:57.652060292 +0000 UTC m=+956.932071524" watchObservedRunningTime="2025-12-05 06:08:57.656609308 +0000 UTC m=+956.936620530" Dec 05 06:08:57 crc kubenswrapper[4865]: I1205 06:08:57.676536 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-967d97867-r8f45" podStartSLOduration=4.517518499 podStartE2EDuration="51.676520597s" podCreationTimestamp="2025-12-05 06:08:06 +0000 UTC" firstStartedPulling="2025-12-05 06:08:08.920673056 +0000 UTC m=+908.200684278" lastFinishedPulling="2025-12-05 06:08:56.079675154 +0000 UTC m=+955.359686376" observedRunningTime="2025-12-05 06:08:57.675098288 +0000 UTC m=+956.955109510" watchObservedRunningTime="2025-12-05 06:08:57.676520597 +0000 UTC m=+956.956531819" Dec 05 06:08:58 crc kubenswrapper[4865]: I1205 06:08:58.439931 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7nqrp" event={"ID":"8e1c4c0e-047b-4727-9435-7192e4f48bea","Type":"ContainerStarted","Data":"f2859708db9a5cf2ab7cf5aae8c1555a594e259274a10c11e4fa005350215191"} Dec 05 06:08:58 crc kubenswrapper[4865]: I1205 06:08:58.440867 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mlcgh" Dec 05 06:08:58 crc 
kubenswrapper[4865]: I1205 06:08:58.463799 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7nqrp" podStartSLOduration=5.012709019 podStartE2EDuration="52.463782037s" podCreationTimestamp="2025-12-05 06:08:06 +0000 UTC" firstStartedPulling="2025-12-05 06:08:09.404203662 +0000 UTC m=+908.684214884" lastFinishedPulling="2025-12-05 06:08:56.85527668 +0000 UTC m=+956.135287902" observedRunningTime="2025-12-05 06:08:58.458161541 +0000 UTC m=+957.738172763" watchObservedRunningTime="2025-12-05 06:08:58.463782037 +0000 UTC m=+957.743793259" Dec 05 06:08:59 crc kubenswrapper[4865]: I1205 06:08:59.456067 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4546x" event={"ID":"571eed7b-c231-42db-8acd-8f2efc828947","Type":"ContainerStarted","Data":"c965bcff8600de72c00434d3da09e25e7714fa7bbfcb9b1ffe8ff80074fe5d24"} Dec 05 06:08:59 crc kubenswrapper[4865]: I1205 06:08:59.456450 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4546x" Dec 05 06:08:59 crc kubenswrapper[4865]: I1205 06:08:59.461674 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-97l79" event={"ID":"f4fc5327-1468-48aa-9a51-e8be8bfb5629","Type":"ContainerStarted","Data":"1f288405f29013aa9d48fc4a563b9200e563caef0263d88055dbf5ccaba6bb51"} Dec 05 06:08:59 crc kubenswrapper[4865]: I1205 06:08:59.461915 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-97l79" Dec 05 06:08:59 crc kubenswrapper[4865]: I1205 06:08:59.466450 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8zkdr" event={"ID":"db94fe25-0c93-4471-852d-45b20c0f266c","Type":"ContainerStarted","Data":"af82afeaab0e0b0bf2540add23d98f845a7422453755afa85da0be446dad3fe6"} Dec 05 06:08:59 crc kubenswrapper[4865]: I1205 06:08:59.466624 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8zkdr" Dec 05 06:08:59 crc kubenswrapper[4865]: I1205 06:08:59.471140 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v25kd" event={"ID":"2364f477-be51-4698-914a-94d0fd2dd983","Type":"ContainerStarted","Data":"c9c9874b871073a620ab4af3df5555c68c8bbed68438600b0b00b4756e9c7874"} Dec 05 06:08:59 crc kubenswrapper[4865]: I1205 06:08:59.471697 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v25kd" Dec 05 06:08:59 crc kubenswrapper[4865]: I1205 06:08:59.474139 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-j6st7" event={"ID":"30f6dc0d-1962-42c0-a128-d7a54943d849","Type":"ContainerStarted","Data":"296475a1b3be490b450f8894fb3822e118f38b7f9b7f5f7cd8895730a7b2b74c"} Dec 05 06:08:59 crc kubenswrapper[4865]: I1205 06:08:59.474773 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-j6st7" Dec 05 06:08:59 crc kubenswrapper[4865]: I1205 06:08:59.475331 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7nqrp" Dec 05 06:08:59 crc kubenswrapper[4865]: I1205 06:08:59.516856 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4546x" podStartSLOduration=4.080081927 podStartE2EDuration="53.516830192s" podCreationTimestamp="2025-12-05 06:08:06 +0000 UTC" firstStartedPulling="2025-12-05 06:08:09.296461958 +0000 UTC m=+908.576473180" lastFinishedPulling="2025-12-05 06:08:58.733210223 +0000 UTC m=+958.013221445" observedRunningTime="2025-12-05 06:08:59.482680239 +0000 UTC m=+958.762691461" watchObservedRunningTime="2025-12-05 06:08:59.516830192 +0000 UTC m=+958.796841404" Dec 05 06:08:59 crc kubenswrapper[4865]: I1205 06:08:59.516995 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v25kd" podStartSLOduration=3.742362956 podStartE2EDuration="53.516991946s" podCreationTimestamp="2025-12-05 06:08:06 +0000 UTC" firstStartedPulling="2025-12-05 06:08:08.967431557 +0000 UTC m=+908.247442769" lastFinishedPulling="2025-12-05 06:08:58.742060537 +0000 UTC m=+958.022071759" observedRunningTime="2025-12-05 06:08:59.511602078 +0000 UTC m=+958.791613300" watchObservedRunningTime="2025-12-05 06:08:59.516991946 +0000 UTC m=+958.797003168" Dec 05 06:08:59 crc kubenswrapper[4865]: I1205 06:08:59.534016 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-97l79" podStartSLOduration=4.541956294 podStartE2EDuration="54.534008246s" podCreationTimestamp="2025-12-05 06:08:05 +0000 UTC" firstStartedPulling="2025-12-05 06:08:08.890003289 +0000 UTC m=+908.170014511" lastFinishedPulling="2025-12-05 06:08:58.882055241 +0000 UTC m=+958.162066463" observedRunningTime="2025-12-05 06:08:59.531691452 +0000 UTC m=+958.811702674" watchObservedRunningTime="2025-12-05 06:08:59.534008246 +0000 UTC m=+958.814019468" Dec 05 06:08:59 crc kubenswrapper[4865]: I1205 06:08:59.563573 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8zkdr" podStartSLOduration=4.817575612 podStartE2EDuration="54.563555202s" podCreationTimestamp="2025-12-05 06:08:05 +0000 UTC" firstStartedPulling="2025-12-05 06:08:08.833000106 +0000 UTC m=+908.113011328" lastFinishedPulling="2025-12-05 06:08:58.578979696 +0000 UTC m=+957.858990918" observedRunningTime="2025-12-05 06:08:59.556077845 +0000 UTC m=+958.836089067" watchObservedRunningTime="2025-12-05 06:08:59.563555202 +0000 UTC m=+958.843566424" Dec 05 06:09:02 crc kubenswrapper[4865]: I1205 06:09:02.152128 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-758b7cbd9c-d2qcb" Dec 05 06:09:02 crc kubenswrapper[4865]: I1205 06:09:02.175492 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-j6st7" podStartSLOduration=6.671132081 podStartE2EDuration="57.175465672s" podCreationTimestamp="2025-12-05 06:08:05 +0000 UTC" firstStartedPulling="2025-12-05 06:08:08.296232731 +0000 UTC m=+907.576243953" lastFinishedPulling="2025-12-05 06:08:58.800566322 +0000 UTC m=+958.080577544" observedRunningTime="2025-12-05 06:08:59.603203546 +0000 UTC m=+958.883214768" watchObservedRunningTime="2025-12-05 
06:09:02.175465672 +0000 UTC m=+961.455476894" Dec 05 06:09:02 crc kubenswrapper[4865]: I1205 06:09:02.499222 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" event={"ID":"2d41068d-3439-4a1d-bb73-9d974c281d4c","Type":"ContainerStarted","Data":"f71ef094fcbc8a4ba883b59fdcac59c583d65abb6914f941f8acbe1e52943edf"} Dec 05 06:09:02 crc kubenswrapper[4865]: I1205 06:09:02.499267 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" event={"ID":"2d41068d-3439-4a1d-bb73-9d974c281d4c","Type":"ContainerStarted","Data":"2e54a4c9e6b58579315fba7010afdf03d214544a655d494e7119a4339c080023"} Dec 05 06:09:02 crc kubenswrapper[4865]: I1205 06:09:02.500048 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" Dec 05 06:09:02 crc kubenswrapper[4865]: I1205 06:09:02.530936 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" podStartSLOduration=42.817626926 podStartE2EDuration="56.530900112s" podCreationTimestamp="2025-12-05 06:08:06 +0000 UTC" firstStartedPulling="2025-12-05 06:08:47.858345819 +0000 UTC m=+947.138357051" lastFinishedPulling="2025-12-05 06:09:01.571619005 +0000 UTC m=+960.851630237" observedRunningTime="2025-12-05 06:09:02.526225553 +0000 UTC m=+961.806236835" watchObservedRunningTime="2025-12-05 06:09:02.530900112 +0000 UTC m=+961.810911374" Dec 05 06:09:06 crc kubenswrapper[4865]: I1205 06:09:06.188998 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-97l79" Dec 05 06:09:06 crc kubenswrapper[4865]: I1205 06:09:06.208972 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-j6st7" Dec 05 06:09:06 crc kubenswrapper[4865]: I1205 06:09:06.335165 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kkstd" Dec 05 06:09:06 crc kubenswrapper[4865]: I1205 06:09:06.497125 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8zkdr" Dec 05 06:09:06 crc kubenswrapper[4865]: I1205 06:09:06.621378 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-967d97867-r8f45" Dec 05 06:09:06 crc kubenswrapper[4865]: I1205 06:09:06.697674 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mlcgh" Dec 05 06:09:06 crc kubenswrapper[4865]: I1205 06:09:06.933125 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v25kd" Dec 05 06:09:06 crc kubenswrapper[4865]: I1205 06:09:06.981836 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-cv8vc" Dec 05 06:09:07 crc kubenswrapper[4865]: I1205 06:09:07.061168 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7nqrp" Dec 05 06:09:07 crc kubenswrapper[4865]: I1205 06:09:07.183225 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4546x" Dec 05 06:09:07 crc kubenswrapper[4865]: I1205 06:09:07.519964 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2jh72" Dec 05 06:09:07 crc kubenswrapper[4865]: I1205 06:09:07.932993 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-qphvq" Dec 05 06:09:11 crc kubenswrapper[4865]: I1205 06:09:11.014192 4865 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 06:09:12 crc kubenswrapper[4865]: I1205 06:09:12.597017 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-zzb4b" event={"ID":"cf1398f2-aa09-45bb-9a98-5fadca999284","Type":"ContainerStarted","Data":"ae29685ffe98f43b2f8aeba3e0d1949d3773cd8ffed917b086e2d7c3fed8e002"} Dec 05 06:09:12 crc kubenswrapper[4865]: I1205 06:09:12.598534 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-zzb4b" Dec 05 06:09:12 crc kubenswrapper[4865]: I1205 06:09:12.602174 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j5vmw" event={"ID":"9b591a19-b272-4a03-8164-c0296161feb7","Type":"ContainerStarted","Data":"101493e34f05ae78511092946723d6d60264249400c1ecf1ab830ac1167f1ae7"} Dec 05 06:09:12 crc kubenswrapper[4865]: I1205 06:09:12.602769 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j5vmw" Dec 05 06:09:12 crc kubenswrapper[4865]: I1205 06:09:12.605007 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cppdn" event={"ID":"1363659b-58f9-4f41-800c-863dd656d2b8","Type":"ContainerStarted","Data":"aebf40dab1f50fc85009ba9e6460f1110fa117303a95d624da83df081cb72b39"} Dec 05 06:09:12 crc kubenswrapper[4865]: I1205 06:09:12.605441 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cppdn" Dec 05 06:09:12 crc kubenswrapper[4865]: I1205 06:09:12.623591 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-zzb4b" podStartSLOduration=4.920421639 podStartE2EDuration="1m6.623567888s" podCreationTimestamp="2025-12-05 06:08:06 +0000 UTC" firstStartedPulling="2025-12-05 06:08:09.714532546 +0000 UTC m=+908.994543768" lastFinishedPulling="2025-12-05 06:09:11.417678805 +0000 UTC m=+970.697690017" observedRunningTime="2025-12-05 06:09:12.619794214 +0000 UTC m=+971.899805466" watchObservedRunningTime="2025-12-05 06:09:12.623567888 +0000 UTC m=+971.903579130" Dec 05 06:09:12 crc kubenswrapper[4865]: I1205 06:09:12.647145 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cppdn" podStartSLOduration=4.852383161 podStartE2EDuration="1m6.647115528s" podCreationTimestamp="2025-12-05 06:08:06 +0000 UTC" 
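Annotation: the kubelet.go:2542 "SyncLoop (probe)" lines in this stretch record readiness transitions. status="" is emitted right after a container starts, while its readiness is still unknown; status="ready" appears once the readiness probe first succeeds, here a few seconds after the ContainerStarted events (starts around 06:08:59, ready around 06:09:06-07). The probes themselves live in each operator Deployment's pod template, which is not part of this journal; the sketch below only illustrates the shape of such a probe, and the /readyz path, port 8081 and timings are conventional controller-runtime style placeholders, not values read from this log.

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    func main() {
        // Placeholder readiness probe; handler, port and timings are illustrative only.
        readiness := corev1.Probe{
            ProbeHandler: corev1.ProbeHandler{
                HTTPGet: &corev1.HTTPGetAction{Path: "/readyz", Port: intstr.FromInt(8081)},
            },
            InitialDelaySeconds: 5,
            PeriodSeconds:       10,
        }
        fmt.Printf("readiness probe: %+v\n", readiness.HTTPGet)
    }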
firstStartedPulling="2025-12-05 06:08:09.709567789 +0000 UTC m=+908.989579011" lastFinishedPulling="2025-12-05 06:09:11.504300146 +0000 UTC m=+970.784311378" observedRunningTime="2025-12-05 06:09:12.646356747 +0000 UTC m=+971.926367979" watchObservedRunningTime="2025-12-05 06:09:12.647115528 +0000 UTC m=+971.927126760" Dec 05 06:09:12 crc kubenswrapper[4865]: I1205 06:09:12.670860 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j5vmw" podStartSLOduration=4.919462184 podStartE2EDuration="1m6.670820043s" podCreationTimestamp="2025-12-05 06:08:06 +0000 UTC" firstStartedPulling="2025-12-05 06:08:09.751339842 +0000 UTC m=+909.031351064" lastFinishedPulling="2025-12-05 06:09:11.502697691 +0000 UTC m=+970.782708923" observedRunningTime="2025-12-05 06:09:12.664472937 +0000 UTC m=+971.944484189" watchObservedRunningTime="2025-12-05 06:09:12.670820043 +0000 UTC m=+971.950831285" Dec 05 06:09:13 crc kubenswrapper[4865]: I1205 06:09:13.348654 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" Dec 05 06:09:17 crc kubenswrapper[4865]: I1205 06:09:17.966326 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cppdn" Dec 05 06:09:17 crc kubenswrapper[4865]: I1205 06:09:17.983671 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j5vmw" Dec 05 06:09:18 crc kubenswrapper[4865]: I1205 06:09:18.239939 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-zzb4b" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.234437 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g2cb9"] Dec 05 06:09:32 crc kubenswrapper[4865]: E1205 06:09:32.236716 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75b2e75d-6906-4a48-953d-647cbc08256d" containerName="extract-content" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.236808 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b2e75d-6906-4a48-953d-647cbc08256d" containerName="extract-content" Dec 05 06:09:32 crc kubenswrapper[4865]: E1205 06:09:32.240253 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75b2e75d-6906-4a48-953d-647cbc08256d" containerName="extract-utilities" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.240373 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b2e75d-6906-4a48-953d-647cbc08256d" containerName="extract-utilities" Dec 05 06:09:32 crc kubenswrapper[4865]: E1205 06:09:32.240491 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75b2e75d-6906-4a48-953d-647cbc08256d" containerName="registry-server" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.240556 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b2e75d-6906-4a48-953d-647cbc08256d" containerName="registry-server" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.241120 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="75b2e75d-6906-4a48-953d-647cbc08256d" containerName="registry-server" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.248492 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g2cb9" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.257536 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.258084 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-6dnf9" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.258386 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.258730 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.303855 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g2cb9"] Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.332992 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pvtgk"] Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.334305 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pvtgk" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.340152 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.342870 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pvtgk"] Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.356538 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlgw7\" (UniqueName: \"kubernetes.io/projected/18c4c2dd-31a3-4b3c-8401-31c37d89ac3d-kube-api-access-rlgw7\") pod \"dnsmasq-dns-675f4bcbfc-g2cb9\" (UID: \"18c4c2dd-31a3-4b3c-8401-31c37d89ac3d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g2cb9" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.356818 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18c4c2dd-31a3-4b3c-8401-31c37d89ac3d-config\") pod \"dnsmasq-dns-675f4bcbfc-g2cb9\" (UID: \"18c4c2dd-31a3-4b3c-8401-31c37d89ac3d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g2cb9" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.458298 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlgw7\" (UniqueName: \"kubernetes.io/projected/18c4c2dd-31a3-4b3c-8401-31c37d89ac3d-kube-api-access-rlgw7\") pod \"dnsmasq-dns-675f4bcbfc-g2cb9\" (UID: \"18c4c2dd-31a3-4b3c-8401-31c37d89ac3d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g2cb9" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.458343 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k78f\" (UniqueName: \"kubernetes.io/projected/51f79fb1-29a1-479c-8574-0b2d7d549c3b-kube-api-access-2k78f\") pod \"dnsmasq-dns-78dd6ddcc-pvtgk\" (UID: \"51f79fb1-29a1-479c-8574-0b2d7d549c3b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pvtgk" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.458414 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18c4c2dd-31a3-4b3c-8401-31c37d89ac3d-config\") pod \"dnsmasq-dns-675f4bcbfc-g2cb9\" (UID: \"18c4c2dd-31a3-4b3c-8401-31c37d89ac3d\") " 
pod="openstack/dnsmasq-dns-675f4bcbfc-g2cb9" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.458477 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51f79fb1-29a1-479c-8574-0b2d7d549c3b-config\") pod \"dnsmasq-dns-78dd6ddcc-pvtgk\" (UID: \"51f79fb1-29a1-479c-8574-0b2d7d549c3b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pvtgk" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.458499 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51f79fb1-29a1-479c-8574-0b2d7d549c3b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-pvtgk\" (UID: \"51f79fb1-29a1-479c-8574-0b2d7d549c3b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pvtgk" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.459430 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18c4c2dd-31a3-4b3c-8401-31c37d89ac3d-config\") pod \"dnsmasq-dns-675f4bcbfc-g2cb9\" (UID: \"18c4c2dd-31a3-4b3c-8401-31c37d89ac3d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g2cb9" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.486301 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlgw7\" (UniqueName: \"kubernetes.io/projected/18c4c2dd-31a3-4b3c-8401-31c37d89ac3d-kube-api-access-rlgw7\") pod \"dnsmasq-dns-675f4bcbfc-g2cb9\" (UID: \"18c4c2dd-31a3-4b3c-8401-31c37d89ac3d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g2cb9" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.559930 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51f79fb1-29a1-479c-8574-0b2d7d549c3b-config\") pod \"dnsmasq-dns-78dd6ddcc-pvtgk\" (UID: \"51f79fb1-29a1-479c-8574-0b2d7d549c3b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pvtgk" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.559990 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51f79fb1-29a1-479c-8574-0b2d7d549c3b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-pvtgk\" (UID: \"51f79fb1-29a1-479c-8574-0b2d7d549c3b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pvtgk" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.560026 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k78f\" (UniqueName: \"kubernetes.io/projected/51f79fb1-29a1-479c-8574-0b2d7d549c3b-kube-api-access-2k78f\") pod \"dnsmasq-dns-78dd6ddcc-pvtgk\" (UID: \"51f79fb1-29a1-479c-8574-0b2d7d549c3b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pvtgk" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.561557 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51f79fb1-29a1-479c-8574-0b2d7d549c3b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-pvtgk\" (UID: \"51f79fb1-29a1-479c-8574-0b2d7d549c3b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pvtgk" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.561711 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51f79fb1-29a1-479c-8574-0b2d7d549c3b-config\") pod \"dnsmasq-dns-78dd6ddcc-pvtgk\" (UID: \"51f79fb1-29a1-479c-8574-0b2d7d549c3b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pvtgk" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.578957 4865 
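Annotation: the reconciler_common.go and operation_generator.go lines in this block walk each dnsmasq volume through the kubelet volume manager's stages: operationExecutor.VerifyControllerAttachedVolume (the volume is in the desired world and considered attached), operationExecutor.MountVolume started, and finally MountVolume.SetUp succeeded. A throwaway stdlib filter like the one below, which is not part of any tooling referenced in this journal and assumes the journal has been exported with one entry per line, makes the per-pod, per-volume progression easier to follow:

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    // Matches the volume-manager progress messages seen above. Quotes inside the
    // structured log values are backslash-escaped in the journal, hence the \\" .
    var step = regexp.MustCompile(`"(?:operationExecutor\.)?(VerifyControllerAttachedVolume started|MountVolume started|MountVolume\.SetUp succeeded) for volume \\"([^\\"]+)\\".*pod="([^"]+)"`)

    func main() {
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // these journal lines are long
        for sc.Scan() {
            if m := step.FindStringSubmatch(sc.Text()); m != nil {
                fmt.Printf("%-45s %-22s %s\n", m[3], m[2], m[1])
            }
        }
    }

Fed this part of the journal it prints, for example, openstack/dnsmasq-dns-675f4bcbfc-g2cb9 / config / MountVolume.SetUp succeeded.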
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k78f\" (UniqueName: \"kubernetes.io/projected/51f79fb1-29a1-479c-8574-0b2d7d549c3b-kube-api-access-2k78f\") pod \"dnsmasq-dns-78dd6ddcc-pvtgk\" (UID: \"51f79fb1-29a1-479c-8574-0b2d7d549c3b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pvtgk" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.597816 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g2cb9" Dec 05 06:09:32 crc kubenswrapper[4865]: I1205 06:09:32.660384 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pvtgk" Dec 05 06:09:33 crc kubenswrapper[4865]: I1205 06:09:33.112782 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g2cb9"] Dec 05 06:09:33 crc kubenswrapper[4865]: I1205 06:09:33.245651 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pvtgk"] Dec 05 06:09:33 crc kubenswrapper[4865]: W1205 06:09:33.248163 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51f79fb1_29a1_479c_8574_0b2d7d549c3b.slice/crio-a461ffb481aeb35da1a66cbfd9551ed494dcca48b9ed992d83ed2ac8e7ccbfc3 WatchSource:0}: Error finding container a461ffb481aeb35da1a66cbfd9551ed494dcca48b9ed992d83ed2ac8e7ccbfc3: Status 404 returned error can't find the container with id a461ffb481aeb35da1a66cbfd9551ed494dcca48b9ed992d83ed2ac8e7ccbfc3 Dec 05 06:09:33 crc kubenswrapper[4865]: I1205 06:09:33.803153 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-pvtgk" event={"ID":"51f79fb1-29a1-479c-8574-0b2d7d549c3b","Type":"ContainerStarted","Data":"a461ffb481aeb35da1a66cbfd9551ed494dcca48b9ed992d83ed2ac8e7ccbfc3"} Dec 05 06:09:33 crc kubenswrapper[4865]: I1205 06:09:33.804485 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-g2cb9" event={"ID":"18c4c2dd-31a3-4b3c-8401-31c37d89ac3d","Type":"ContainerStarted","Data":"cdbced58f2381cb8c3cd20251b7f8e5b9e961abf56045d6ffad7ead808dc51f8"} Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.245307 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g2cb9"] Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.277604 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pw9js"] Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.278809 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pw9js" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.301199 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pw9js"] Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.406393 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwcd4\" (UniqueName: \"kubernetes.io/projected/273190c6-096d-4098-b426-7fa1854b09f8-kube-api-access-jwcd4\") pod \"dnsmasq-dns-666b6646f7-pw9js\" (UID: \"273190c6-096d-4098-b426-7fa1854b09f8\") " pod="openstack/dnsmasq-dns-666b6646f7-pw9js" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.406450 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/273190c6-096d-4098-b426-7fa1854b09f8-dns-svc\") pod \"dnsmasq-dns-666b6646f7-pw9js\" (UID: \"273190c6-096d-4098-b426-7fa1854b09f8\") " pod="openstack/dnsmasq-dns-666b6646f7-pw9js" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.406491 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/273190c6-096d-4098-b426-7fa1854b09f8-config\") pod \"dnsmasq-dns-666b6646f7-pw9js\" (UID: \"273190c6-096d-4098-b426-7fa1854b09f8\") " pod="openstack/dnsmasq-dns-666b6646f7-pw9js" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.508097 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/273190c6-096d-4098-b426-7fa1854b09f8-config\") pod \"dnsmasq-dns-666b6646f7-pw9js\" (UID: \"273190c6-096d-4098-b426-7fa1854b09f8\") " pod="openstack/dnsmasq-dns-666b6646f7-pw9js" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.508235 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwcd4\" (UniqueName: \"kubernetes.io/projected/273190c6-096d-4098-b426-7fa1854b09f8-kube-api-access-jwcd4\") pod \"dnsmasq-dns-666b6646f7-pw9js\" (UID: \"273190c6-096d-4098-b426-7fa1854b09f8\") " pod="openstack/dnsmasq-dns-666b6646f7-pw9js" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.508281 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/273190c6-096d-4098-b426-7fa1854b09f8-dns-svc\") pod \"dnsmasq-dns-666b6646f7-pw9js\" (UID: \"273190c6-096d-4098-b426-7fa1854b09f8\") " pod="openstack/dnsmasq-dns-666b6646f7-pw9js" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.509923 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/273190c6-096d-4098-b426-7fa1854b09f8-dns-svc\") pod \"dnsmasq-dns-666b6646f7-pw9js\" (UID: \"273190c6-096d-4098-b426-7fa1854b09f8\") " pod="openstack/dnsmasq-dns-666b6646f7-pw9js" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.509937 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/273190c6-096d-4098-b426-7fa1854b09f8-config\") pod \"dnsmasq-dns-666b6646f7-pw9js\" (UID: \"273190c6-096d-4098-b426-7fa1854b09f8\") " pod="openstack/dnsmasq-dns-666b6646f7-pw9js" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.555214 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwcd4\" (UniqueName: 
\"kubernetes.io/projected/273190c6-096d-4098-b426-7fa1854b09f8-kube-api-access-jwcd4\") pod \"dnsmasq-dns-666b6646f7-pw9js\" (UID: \"273190c6-096d-4098-b426-7fa1854b09f8\") " pod="openstack/dnsmasq-dns-666b6646f7-pw9js" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.571651 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pvtgk"] Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.596508 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pw9js" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.605523 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9tvv7"] Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.608220 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9tvv7" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.633015 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9tvv7"] Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.714105 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4572a2a-3255-485e-91ea-69fe52b1e3a6-config\") pod \"dnsmasq-dns-57d769cc4f-9tvv7\" (UID: \"a4572a2a-3255-485e-91ea-69fe52b1e3a6\") " pod="openstack/dnsmasq-dns-57d769cc4f-9tvv7" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.714163 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqj5r\" (UniqueName: \"kubernetes.io/projected/a4572a2a-3255-485e-91ea-69fe52b1e3a6-kube-api-access-zqj5r\") pod \"dnsmasq-dns-57d769cc4f-9tvv7\" (UID: \"a4572a2a-3255-485e-91ea-69fe52b1e3a6\") " pod="openstack/dnsmasq-dns-57d769cc4f-9tvv7" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.714202 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4572a2a-3255-485e-91ea-69fe52b1e3a6-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9tvv7\" (UID: \"a4572a2a-3255-485e-91ea-69fe52b1e3a6\") " pod="openstack/dnsmasq-dns-57d769cc4f-9tvv7" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.816908 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4572a2a-3255-485e-91ea-69fe52b1e3a6-config\") pod \"dnsmasq-dns-57d769cc4f-9tvv7\" (UID: \"a4572a2a-3255-485e-91ea-69fe52b1e3a6\") " pod="openstack/dnsmasq-dns-57d769cc4f-9tvv7" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.816949 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqj5r\" (UniqueName: \"kubernetes.io/projected/a4572a2a-3255-485e-91ea-69fe52b1e3a6-kube-api-access-zqj5r\") pod \"dnsmasq-dns-57d769cc4f-9tvv7\" (UID: \"a4572a2a-3255-485e-91ea-69fe52b1e3a6\") " pod="openstack/dnsmasq-dns-57d769cc4f-9tvv7" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.816984 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4572a2a-3255-485e-91ea-69fe52b1e3a6-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9tvv7\" (UID: \"a4572a2a-3255-485e-91ea-69fe52b1e3a6\") " pod="openstack/dnsmasq-dns-57d769cc4f-9tvv7" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.819203 4865 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4572a2a-3255-485e-91ea-69fe52b1e3a6-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9tvv7\" (UID: \"a4572a2a-3255-485e-91ea-69fe52b1e3a6\") " pod="openstack/dnsmasq-dns-57d769cc4f-9tvv7" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.819671 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4572a2a-3255-485e-91ea-69fe52b1e3a6-config\") pod \"dnsmasq-dns-57d769cc4f-9tvv7\" (UID: \"a4572a2a-3255-485e-91ea-69fe52b1e3a6\") " pod="openstack/dnsmasq-dns-57d769cc4f-9tvv7" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.846979 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqj5r\" (UniqueName: \"kubernetes.io/projected/a4572a2a-3255-485e-91ea-69fe52b1e3a6-kube-api-access-zqj5r\") pod \"dnsmasq-dns-57d769cc4f-9tvv7\" (UID: \"a4572a2a-3255-485e-91ea-69fe52b1e3a6\") " pod="openstack/dnsmasq-dns-57d769cc4f-9tvv7" Dec 05 06:09:35 crc kubenswrapper[4865]: I1205 06:09:35.930965 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9tvv7" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.245135 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pw9js"] Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.337782 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9tvv7"] Dec 05 06:09:36 crc kubenswrapper[4865]: W1205 06:09:36.366606 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4572a2a_3255_485e_91ea_69fe52b1e3a6.slice/crio-878d29538bb298d3c48b6e4867c744f5e3bcebc0add1fea5b4ae94b94ff79d1a WatchSource:0}: Error finding container 878d29538bb298d3c48b6e4867c744f5e3bcebc0add1fea5b4ae94b94ff79d1a: Status 404 returned error can't find the container with id 878d29538bb298d3c48b6e4867c744f5e3bcebc0add1fea5b4ae94b94ff79d1a Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.439998 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.441318 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.447522 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.447808 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.447992 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.448174 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.448319 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.448452 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.451787 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-b2hp7" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.468964 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.538511 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.538553 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.538585 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.538602 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-config-data\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.538625 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.538640 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-tls\") pod 
\"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.538684 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.538704 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.538724 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvw2t\" (UniqueName: \"kubernetes.io/projected/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-kube-api-access-jvw2t\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.538741 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.538766 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.639866 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.639942 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.639968 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.639998 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc 
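Annotation: most of the rabbitmq-server-0 volumes above are ConfigMap, Secret, projected, downward-API or emptyDir volumes, but local-storage02-crc is a pre-provisioned local PersistentVolume; the MountVolume.MountDevice line just below reports device mount path "/mnt/openstack/pv02", so the data lands on a host directory of the crc node. A rough Go sketch of the kind of PV object that produces this behaviour; only the PV name, the host path and the hostname crc are taken from this journal (the hostname only from the journal's host field), while capacity, access mode, volume mode and storage class are invented placeholders:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/api/resource"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    )

    func main() {
        fs := corev1.PersistentVolumeFilesystem
        pv := corev1.PersistentVolume{
            ObjectMeta: metav1.ObjectMeta{Name: "local-storage02-crc"},
            Spec: corev1.PersistentVolumeSpec{
                Capacity: corev1.ResourceList{
                    corev1.ResourceStorage: resource.MustParse("10Gi"), // placeholder size
                },
                AccessModes:      []corev1.PersistentVolumeAccessMode{corev1.ReadWriteOnce},
                StorageClassName: "local-storage", // placeholder class name
                VolumeMode:       &fs,
                PersistentVolumeSource: corev1.PersistentVolumeSource{
                    Local: &corev1.LocalVolumeSource{Path: "/mnt/openstack/pv02"}, // path from the log
                },
                NodeAffinity: &corev1.VolumeNodeAffinity{
                    Required: &corev1.NodeSelector{
                        NodeSelectorTerms: []corev1.NodeSelectorTerm{{
                            MatchExpressions: []corev1.NodeSelectorRequirement{{
                                Key:      "kubernetes.io/hostname",
                                Operator: corev1.NodeSelectorOpIn,
                                Values:   []string{"crc"}, // guess based on the journal's hostname
                            }},
                        }},
                    },
                },
            },
        }
        fmt.Println(pv.Name, pv.Spec.Local.Path)
    }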
kubenswrapper[4865]: I1205 06:09:36.640019 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-config-data\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.640039 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.640054 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.640100 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.640123 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.640143 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvw2t\" (UniqueName: \"kubernetes.io/projected/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-kube-api-access-jvw2t\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.640163 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.641302 4865 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.642123 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.642387 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-server-conf\") pod \"rabbitmq-server-0\" 
(UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.642408 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.642619 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.641375 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-config-data\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.652696 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.653106 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.659257 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.659482 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.674489 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.690234 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvw2t\" (UniqueName: \"kubernetes.io/projected/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-kube-api-access-jvw2t\") pod \"rabbitmq-server-0\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.772860 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.795236 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.800495 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.807959 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.809106 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.809357 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-ds6zh" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.809648 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.809382 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.809526 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.809574 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.809618 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.846547 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff4eacf2-62b6-48a0-9650-77e19a6db904-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.846602 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ff4eacf2-62b6-48a0-9650-77e19a6db904-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.846621 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ff4eacf2-62b6-48a0-9650-77e19a6db904-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.846644 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.846666 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.846688 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.846705 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ff4eacf2-62b6-48a0-9650-77e19a6db904-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.846726 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.846746 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ff4eacf2-62b6-48a0-9650-77e19a6db904-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.846764 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d99nb\" (UniqueName: \"kubernetes.io/projected/ff4eacf2-62b6-48a0-9650-77e19a6db904-kube-api-access-d99nb\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.846791 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.884166 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pw9js" event={"ID":"273190c6-096d-4098-b426-7fa1854b09f8","Type":"ContainerStarted","Data":"a9431d9e41571c77961e5038e99f6820c8b44de0c2a70c406f095d7380deb848"} Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.893554 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9tvv7" event={"ID":"a4572a2a-3255-485e-91ea-69fe52b1e3a6","Type":"ContainerStarted","Data":"878d29538bb298d3c48b6e4867c744f5e3bcebc0add1fea5b4ae94b94ff79d1a"} Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.949070 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.949176 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff4eacf2-62b6-48a0-9650-77e19a6db904-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.949205 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ff4eacf2-62b6-48a0-9650-77e19a6db904-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.949224 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ff4eacf2-62b6-48a0-9650-77e19a6db904-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.949256 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.949285 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.949316 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.949337 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ff4eacf2-62b6-48a0-9650-77e19a6db904-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.949361 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.949389 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ff4eacf2-62b6-48a0-9650-77e19a6db904-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.949414 
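Annotation: the UniqueName printed with every volume operation encodes the volume plugin (kubernetes.io/configmap, kubernetes.io/secret, kubernetes.io/projected, kubernetes.io/empty-dir, kubernetes.io/downward-api, kubernetes.io/local-volume), so a block like this rabbitmq-cell1-server-0 one can be summarised per plugin instead of read line by line. Another ad-hoc stdlib filter, again assuming a copy of the journal with one entry per line; it is not part of any tooling mentioned here:

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
        "sort"
    )

    // Counts successful mounts per (pod, plugin) using the UniqueName field,
    // e.g. UniqueName: \"kubernetes.io/configmap/<uid>-server-conf\".
    var mount = regexp.MustCompile(`"MountVolume\.SetUp succeeded for volume .*UniqueName: \\"kubernetes\.io/([a-z-]+)/.*pod="([^"]+)"`)

    func main() {
        counts := map[string]int{}
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
        for sc.Scan() {
            if m := mount.FindStringSubmatch(sc.Text()); m != nil {
                counts[m[2]+" "+m[1]]++
            }
        }
        keys := make([]string, 0, len(counts))
        for k := range counts {
            keys = append(keys, k)
        }
        sort.Strings(keys)
        for _, k := range keys {
            fmt.Println(k, counts[k])
        }
    }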
4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d99nb\" (UniqueName: \"kubernetes.io/projected/ff4eacf2-62b6-48a0-9650-77e19a6db904-kube-api-access-d99nb\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.950238 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.952238 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff4eacf2-62b6-48a0-9650-77e19a6db904-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.952592 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.952691 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ff4eacf2-62b6-48a0-9650-77e19a6db904-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.953682 4865 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.954719 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ff4eacf2-62b6-48a0-9650-77e19a6db904-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.956520 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ff4eacf2-62b6-48a0-9650-77e19a6db904-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.959717 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.969124 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.971898 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ff4eacf2-62b6-48a0-9650-77e19a6db904-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.974234 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d99nb\" (UniqueName: \"kubernetes.io/projected/ff4eacf2-62b6-48a0-9650-77e19a6db904-kube-api-access-d99nb\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:36 crc kubenswrapper[4865]: I1205 06:09:36.991670 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:37 crc kubenswrapper[4865]: I1205 06:09:37.168104 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:09:37 crc kubenswrapper[4865]: I1205 06:09:37.528935 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 06:09:37 crc kubenswrapper[4865]: W1205 06:09:37.569317 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb655d6e9_628e_4fbb_a0c4_fc46a71b9ff3.slice/crio-6b031c927d1ba7e31c906af03a3e3c5733431cd688c4399aad5b24d5a5cc6be2 WatchSource:0}: Error finding container 6b031c927d1ba7e31c906af03a3e3c5733431cd688c4399aad5b24d5a5cc6be2: Status 404 returned error can't find the container with id 6b031c927d1ba7e31c906af03a3e3c5733431cd688c4399aad5b24d5a5cc6be2 Dec 05 06:09:37 crc kubenswrapper[4865]: I1205 06:09:37.781095 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 06:09:37 crc kubenswrapper[4865]: W1205 06:09:37.808319 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff4eacf2_62b6_48a0_9650_77e19a6db904.slice/crio-b95c6f356ce9612272cd648b356e6894566ef6dd498e1dfff75931f4fcf8fc82 WatchSource:0}: Error finding container b95c6f356ce9612272cd648b356e6894566ef6dd498e1dfff75931f4fcf8fc82: Status 404 returned error can't find the container with id b95c6f356ce9612272cd648b356e6894566ef6dd498e1dfff75931f4fcf8fc82 Dec 05 06:09:37 crc kubenswrapper[4865]: I1205 06:09:37.910028 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3","Type":"ContainerStarted","Data":"6b031c927d1ba7e31c906af03a3e3c5733431cd688c4399aad5b24d5a5cc6be2"} Dec 05 06:09:37 crc kubenswrapper[4865]: I1205 06:09:37.916950 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ff4eacf2-62b6-48a0-9650-77e19a6db904","Type":"ContainerStarted","Data":"b95c6f356ce9612272cd648b356e6894566ef6dd498e1dfff75931f4fcf8fc82"} Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.166292 4865 
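Annotation: the W1205 manager.go:1169 warnings in this block ("Failed to process watch event ... Status 404 ... can't find the container with id ...") come from cadvisor noticing the new crio-<id> cgroup before CRI-O has finished registering the container, so the metadata lookup briefly 404s; the same container IDs reappear moments later in the SyncLoop (PLEG) ContainerStarted events here, and likewise for the dnsmasq pods earlier, which is why these warnings can be treated as transient. A small stdlib cross-check along those lines, assuming one journal entry per line (an ad-hoc helper, not an existing tool):

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    // Pairs cadvisor "can't find the container with id <id>" warnings with PLEG
    // ContainerStarted events carrying the same 64-hex-char ID; IDs that never get
    // a ContainerStarted event are the ones worth investigating.
    var (
        missed  = regexp.MustCompile(`can't find the container with id ([0-9a-f]{64})`)
        started = regexp.MustCompile(`"ContainerStarted","Data":"([0-9a-f]{64})"`)
    )

    func main() {
        seenStart := map[string]bool{}
        var warned []string
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
        for sc.Scan() {
            line := sc.Text()
            if m := missed.FindStringSubmatch(line); m != nil {
                warned = append(warned, m[1])
            }
            if m := started.FindStringSubmatch(line); m != nil {
                seenStart[m[1]] = true
            }
        }
        for _, id := range warned {
            fmt.Printf("%s started=%v\n", id[:12], seenStart[id])
        }
    }

For the four warnings in this part of the journal, every ID prints started=true.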
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.169446 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.185608 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.189705 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.190034 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.190258 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.191174 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bvn5q" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.203982 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.290574 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a62a048-0ebe-4e5e-988a-4dde7746af74-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.290614 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rfxh\" (UniqueName: \"kubernetes.io/projected/8a62a048-0ebe-4e5e-988a-4dde7746af74-kube-api-access-2rfxh\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.290639 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8a62a048-0ebe-4e5e-988a-4dde7746af74-config-data-default\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.290668 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.290700 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8a62a048-0ebe-4e5e-988a-4dde7746af74-kolla-config\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.290730 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a62a048-0ebe-4e5e-988a-4dde7746af74-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " 
pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.290758 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8a62a048-0ebe-4e5e-988a-4dde7746af74-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.290778 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a62a048-0ebe-4e5e-988a-4dde7746af74-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.391746 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a62a048-0ebe-4e5e-988a-4dde7746af74-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.391815 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8a62a048-0ebe-4e5e-988a-4dde7746af74-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.391863 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a62a048-0ebe-4e5e-988a-4dde7746af74-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.391905 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rfxh\" (UniqueName: \"kubernetes.io/projected/8a62a048-0ebe-4e5e-988a-4dde7746af74-kube-api-access-2rfxh\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.391921 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a62a048-0ebe-4e5e-988a-4dde7746af74-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.391943 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8a62a048-0ebe-4e5e-988a-4dde7746af74-config-data-default\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.391968 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.392003 4865 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8a62a048-0ebe-4e5e-988a-4dde7746af74-kolla-config\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.392635 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8a62a048-0ebe-4e5e-988a-4dde7746af74-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.392815 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8a62a048-0ebe-4e5e-988a-4dde7746af74-kolla-config\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.393035 4865 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.393383 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8a62a048-0ebe-4e5e-988a-4dde7746af74-config-data-default\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.394006 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a62a048-0ebe-4e5e-988a-4dde7746af74-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.402855 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a62a048-0ebe-4e5e-988a-4dde7746af74-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.405037 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a62a048-0ebe-4e5e-988a-4dde7746af74-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.427251 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.445657 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rfxh\" (UniqueName: \"kubernetes.io/projected/8a62a048-0ebe-4e5e-988a-4dde7746af74-kube-api-access-2rfxh\") pod \"openstack-galera-0\" (UID: \"8a62a048-0ebe-4e5e-988a-4dde7746af74\") " 
pod="openstack/openstack-galera-0" Dec 05 06:09:38 crc kubenswrapper[4865]: I1205 06:09:38.515887 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.394061 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 06:09:39 crc kubenswrapper[4865]: W1205 06:09:39.437021 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a62a048_0ebe_4e5e_988a_4dde7746af74.slice/crio-2aede71d890c50d91c69126fdfbba4d15398e70b311dfbede62723dc4aeee514 WatchSource:0}: Error finding container 2aede71d890c50d91c69126fdfbba4d15398e70b311dfbede62723dc4aeee514: Status 404 returned error can't find the container with id 2aede71d890c50d91c69126fdfbba4d15398e70b311dfbede62723dc4aeee514 Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.510686 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.516991 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.529542 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.529740 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.529886 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-qv7md" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.530012 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.540355 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.658962 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a70babd-c8a6-442f-aa44-d013f3887c93-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.659008 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.659031 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8g4s\" (UniqueName: \"kubernetes.io/projected/9a70babd-c8a6-442f-aa44-d013f3887c93-kube-api-access-g8g4s\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.659056 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/9a70babd-c8a6-442f-aa44-d013f3887c93-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.659093 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a70babd-c8a6-442f-aa44-d013f3887c93-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.659151 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a70babd-c8a6-442f-aa44-d013f3887c93-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.659173 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a70babd-c8a6-442f-aa44-d013f3887c93-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.659188 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a70babd-c8a6-442f-aa44-d013f3887c93-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.764583 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a70babd-c8a6-442f-aa44-d013f3887c93-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.764633 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a70babd-c8a6-442f-aa44-d013f3887c93-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.764652 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a70babd-c8a6-442f-aa44-d013f3887c93-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.764696 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.764714 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9a70babd-c8a6-442f-aa44-d013f3887c93-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.764734 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8g4s\" (UniqueName: \"kubernetes.io/projected/9a70babd-c8a6-442f-aa44-d013f3887c93-kube-api-access-g8g4s\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.764751 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a70babd-c8a6-442f-aa44-d013f3887c93-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.764784 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a70babd-c8a6-442f-aa44-d013f3887c93-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.765611 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a70babd-c8a6-442f-aa44-d013f3887c93-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.768128 4865 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.768859 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a70babd-c8a6-442f-aa44-d013f3887c93-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.770759 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a70babd-c8a6-442f-aa44-d013f3887c93-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.771699 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a70babd-c8a6-442f-aa44-d013f3887c93-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.774633 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a70babd-c8a6-442f-aa44-d013f3887c93-combined-ca-bundle\") 
pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.786303 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.787337 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.790270 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.790433 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-b9gww" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.790612 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.801998 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a70babd-c8a6-442f-aa44-d013f3887c93-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.810119 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8g4s\" (UniqueName: \"kubernetes.io/projected/9a70babd-c8a6-442f-aa44-d013f3887c93-kube-api-access-g8g4s\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.828339 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.840569 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9a70babd-c8a6-442f-aa44-d013f3887c93\") " pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.869119 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.870256 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bspg\" (UniqueName: \"kubernetes.io/projected/05035a7d-0d83-46dd-a889-3db64fb647e8-kube-api-access-5bspg\") pod \"memcached-0\" (UID: \"05035a7d-0d83-46dd-a889-3db64fb647e8\") " pod="openstack/memcached-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.870292 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/05035a7d-0d83-46dd-a889-3db64fb647e8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"05035a7d-0d83-46dd-a889-3db64fb647e8\") " pod="openstack/memcached-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.870544 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05035a7d-0d83-46dd-a889-3db64fb647e8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"05035a7d-0d83-46dd-a889-3db64fb647e8\") " pod="openstack/memcached-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.870573 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05035a7d-0d83-46dd-a889-3db64fb647e8-kolla-config\") pod \"memcached-0\" (UID: \"05035a7d-0d83-46dd-a889-3db64fb647e8\") " pod="openstack/memcached-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.870593 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05035a7d-0d83-46dd-a889-3db64fb647e8-config-data\") pod \"memcached-0\" (UID: \"05035a7d-0d83-46dd-a889-3db64fb647e8\") " pod="openstack/memcached-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.964798 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8a62a048-0ebe-4e5e-988a-4dde7746af74","Type":"ContainerStarted","Data":"2aede71d890c50d91c69126fdfbba4d15398e70b311dfbede62723dc4aeee514"} Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.973681 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05035a7d-0d83-46dd-a889-3db64fb647e8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"05035a7d-0d83-46dd-a889-3db64fb647e8\") " pod="openstack/memcached-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.973741 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05035a7d-0d83-46dd-a889-3db64fb647e8-kolla-config\") pod \"memcached-0\" (UID: \"05035a7d-0d83-46dd-a889-3db64fb647e8\") " pod="openstack/memcached-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.973772 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05035a7d-0d83-46dd-a889-3db64fb647e8-config-data\") pod \"memcached-0\" (UID: \"05035a7d-0d83-46dd-a889-3db64fb647e8\") " pod="openstack/memcached-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.973818 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bspg\" (UniqueName: 
\"kubernetes.io/projected/05035a7d-0d83-46dd-a889-3db64fb647e8-kube-api-access-5bspg\") pod \"memcached-0\" (UID: \"05035a7d-0d83-46dd-a889-3db64fb647e8\") " pod="openstack/memcached-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.973865 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/05035a7d-0d83-46dd-a889-3db64fb647e8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"05035a7d-0d83-46dd-a889-3db64fb647e8\") " pod="openstack/memcached-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.979322 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/05035a7d-0d83-46dd-a889-3db64fb647e8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"05035a7d-0d83-46dd-a889-3db64fb647e8\") " pod="openstack/memcached-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.979895 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05035a7d-0d83-46dd-a889-3db64fb647e8-kolla-config\") pod \"memcached-0\" (UID: \"05035a7d-0d83-46dd-a889-3db64fb647e8\") " pod="openstack/memcached-0" Dec 05 06:09:39 crc kubenswrapper[4865]: I1205 06:09:39.980430 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05035a7d-0d83-46dd-a889-3db64fb647e8-config-data\") pod \"memcached-0\" (UID: \"05035a7d-0d83-46dd-a889-3db64fb647e8\") " pod="openstack/memcached-0" Dec 05 06:09:40 crc kubenswrapper[4865]: I1205 06:09:40.020181 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05035a7d-0d83-46dd-a889-3db64fb647e8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"05035a7d-0d83-46dd-a889-3db64fb647e8\") " pod="openstack/memcached-0" Dec 05 06:09:40 crc kubenswrapper[4865]: I1205 06:09:40.029771 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bspg\" (UniqueName: \"kubernetes.io/projected/05035a7d-0d83-46dd-a889-3db64fb647e8-kube-api-access-5bspg\") pod \"memcached-0\" (UID: \"05035a7d-0d83-46dd-a889-3db64fb647e8\") " pod="openstack/memcached-0" Dec 05 06:09:40 crc kubenswrapper[4865]: I1205 06:09:40.245124 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 05 06:09:40 crc kubenswrapper[4865]: I1205 06:09:40.854491 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 06:09:40 crc kubenswrapper[4865]: I1205 06:09:40.972742 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 06:09:42 crc kubenswrapper[4865]: I1205 06:09:42.328783 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 06:09:42 crc kubenswrapper[4865]: I1205 06:09:42.330164 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 06:09:42 crc kubenswrapper[4865]: I1205 06:09:42.335235 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-pppxh" Dec 05 06:09:42 crc kubenswrapper[4865]: I1205 06:09:42.357700 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 06:09:42 crc kubenswrapper[4865]: I1205 06:09:42.430968 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgdbq\" (UniqueName: \"kubernetes.io/projected/1b43a6d5-6af1-413b-bfec-2607a76cc294-kube-api-access-jgdbq\") pod \"kube-state-metrics-0\" (UID: \"1b43a6d5-6af1-413b-bfec-2607a76cc294\") " pod="openstack/kube-state-metrics-0" Dec 05 06:09:42 crc kubenswrapper[4865]: I1205 06:09:42.532294 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgdbq\" (UniqueName: \"kubernetes.io/projected/1b43a6d5-6af1-413b-bfec-2607a76cc294-kube-api-access-jgdbq\") pod \"kube-state-metrics-0\" (UID: \"1b43a6d5-6af1-413b-bfec-2607a76cc294\") " pod="openstack/kube-state-metrics-0" Dec 05 06:09:42 crc kubenswrapper[4865]: I1205 06:09:42.554085 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgdbq\" (UniqueName: \"kubernetes.io/projected/1b43a6d5-6af1-413b-bfec-2607a76cc294-kube-api-access-jgdbq\") pod \"kube-state-metrics-0\" (UID: \"1b43a6d5-6af1-413b-bfec-2607a76cc294\") " pod="openstack/kube-state-metrics-0" Dec 05 06:09:42 crc kubenswrapper[4865]: I1205 06:09:42.663226 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.053247 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-56dth"] Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.054330 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-jbvmz"] Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.054501 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.057243 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.059943 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-56dth"] Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.060526 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.060925 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.061040 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-nmtt4" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.066499 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-jbvmz"] Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.208129 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/30eebd2b-aed6-4866-bec4-da326d89821c-var-run-ovn\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.208195 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b92328b7-456b-45ce-8416-765f465ac793-var-run\") pod \"ovn-controller-ovs-jbvmz\" (UID: \"b92328b7-456b-45ce-8416-765f465ac793\") " pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.208235 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b92328b7-456b-45ce-8416-765f465ac793-etc-ovs\") pod \"ovn-controller-ovs-jbvmz\" (UID: \"b92328b7-456b-45ce-8416-765f465ac793\") " pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.208258 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/30eebd2b-aed6-4866-bec4-da326d89821c-var-run\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.208273 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b92328b7-456b-45ce-8416-765f465ac793-var-lib\") pod \"ovn-controller-ovs-jbvmz\" (UID: \"b92328b7-456b-45ce-8416-765f465ac793\") " pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.208304 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30eebd2b-aed6-4866-bec4-da326d89821c-combined-ca-bundle\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.208323 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/30eebd2b-aed6-4866-bec4-da326d89821c-var-log-ovn\") pod \"ovn-controller-56dth\" (UID: 
\"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.208344 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j82lp\" (UniqueName: \"kubernetes.io/projected/b92328b7-456b-45ce-8416-765f465ac793-kube-api-access-j82lp\") pod \"ovn-controller-ovs-jbvmz\" (UID: \"b92328b7-456b-45ce-8416-765f465ac793\") " pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.208362 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b92328b7-456b-45ce-8416-765f465ac793-var-log\") pod \"ovn-controller-ovs-jbvmz\" (UID: \"b92328b7-456b-45ce-8416-765f465ac793\") " pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.208415 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b92328b7-456b-45ce-8416-765f465ac793-scripts\") pod \"ovn-controller-ovs-jbvmz\" (UID: \"b92328b7-456b-45ce-8416-765f465ac793\") " pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.208445 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30eebd2b-aed6-4866-bec4-da326d89821c-scripts\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.208471 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv62g\" (UniqueName: \"kubernetes.io/projected/30eebd2b-aed6-4866-bec4-da326d89821c-kube-api-access-cv62g\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.208487 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/30eebd2b-aed6-4866-bec4-da326d89821c-ovn-controller-tls-certs\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.310145 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b92328b7-456b-45ce-8416-765f465ac793-var-log\") pod \"ovn-controller-ovs-jbvmz\" (UID: \"b92328b7-456b-45ce-8416-765f465ac793\") " pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.310224 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b92328b7-456b-45ce-8416-765f465ac793-scripts\") pod \"ovn-controller-ovs-jbvmz\" (UID: \"b92328b7-456b-45ce-8416-765f465ac793\") " pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.310253 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30eebd2b-aed6-4866-bec4-da326d89821c-scripts\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" 
Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.310281 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv62g\" (UniqueName: \"kubernetes.io/projected/30eebd2b-aed6-4866-bec4-da326d89821c-kube-api-access-cv62g\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.310328 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/30eebd2b-aed6-4866-bec4-da326d89821c-ovn-controller-tls-certs\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.310357 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/30eebd2b-aed6-4866-bec4-da326d89821c-var-run-ovn\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.310388 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b92328b7-456b-45ce-8416-765f465ac793-var-run\") pod \"ovn-controller-ovs-jbvmz\" (UID: \"b92328b7-456b-45ce-8416-765f465ac793\") " pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.310423 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b92328b7-456b-45ce-8416-765f465ac793-etc-ovs\") pod \"ovn-controller-ovs-jbvmz\" (UID: \"b92328b7-456b-45ce-8416-765f465ac793\") " pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.310447 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b92328b7-456b-45ce-8416-765f465ac793-var-lib\") pod \"ovn-controller-ovs-jbvmz\" (UID: \"b92328b7-456b-45ce-8416-765f465ac793\") " pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.310463 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/30eebd2b-aed6-4866-bec4-da326d89821c-var-run\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.310498 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30eebd2b-aed6-4866-bec4-da326d89821c-combined-ca-bundle\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.310518 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/30eebd2b-aed6-4866-bec4-da326d89821c-var-log-ovn\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.310538 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j82lp\" (UniqueName: 
\"kubernetes.io/projected/b92328b7-456b-45ce-8416-765f465ac793-kube-api-access-j82lp\") pod \"ovn-controller-ovs-jbvmz\" (UID: \"b92328b7-456b-45ce-8416-765f465ac793\") " pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.310701 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b92328b7-456b-45ce-8416-765f465ac793-var-log\") pod \"ovn-controller-ovs-jbvmz\" (UID: \"b92328b7-456b-45ce-8416-765f465ac793\") " pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.310864 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b92328b7-456b-45ce-8416-765f465ac793-var-run\") pod \"ovn-controller-ovs-jbvmz\" (UID: \"b92328b7-456b-45ce-8416-765f465ac793\") " pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.311201 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b92328b7-456b-45ce-8416-765f465ac793-etc-ovs\") pod \"ovn-controller-ovs-jbvmz\" (UID: \"b92328b7-456b-45ce-8416-765f465ac793\") " pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.311332 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b92328b7-456b-45ce-8416-765f465ac793-var-lib\") pod \"ovn-controller-ovs-jbvmz\" (UID: \"b92328b7-456b-45ce-8416-765f465ac793\") " pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.311427 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/30eebd2b-aed6-4866-bec4-da326d89821c-var-run\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.312297 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/30eebd2b-aed6-4866-bec4-da326d89821c-var-log-ovn\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.312771 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b92328b7-456b-45ce-8416-765f465ac793-scripts\") pod \"ovn-controller-ovs-jbvmz\" (UID: \"b92328b7-456b-45ce-8416-765f465ac793\") " pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.314484 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30eebd2b-aed6-4866-bec4-da326d89821c-scripts\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.314714 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/30eebd2b-aed6-4866-bec4-da326d89821c-var-run-ovn\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.317443 4865 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30eebd2b-aed6-4866-bec4-da326d89821c-combined-ca-bundle\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.317486 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/30eebd2b-aed6-4866-bec4-da326d89821c-ovn-controller-tls-certs\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.329334 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv62g\" (UniqueName: \"kubernetes.io/projected/30eebd2b-aed6-4866-bec4-da326d89821c-kube-api-access-cv62g\") pod \"ovn-controller-56dth\" (UID: \"30eebd2b-aed6-4866-bec4-da326d89821c\") " pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.336544 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j82lp\" (UniqueName: \"kubernetes.io/projected/b92328b7-456b-45ce-8416-765f465ac793-kube-api-access-j82lp\") pod \"ovn-controller-ovs-jbvmz\" (UID: \"b92328b7-456b-45ce-8416-765f465ac793\") " pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.402086 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-56dth" Dec 05 06:09:45 crc kubenswrapper[4865]: I1205 06:09:45.420588 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.491326 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.492593 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.499310 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.499505 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.499683 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-cm9ln" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.499905 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.499324 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.502688 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.565308 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-config\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.565358 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.565400 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr97l\" (UniqueName: \"kubernetes.io/projected/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-kube-api-access-kr97l\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.565484 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.565545 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.565589 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.565611 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.565629 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.667776 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.667870 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.667905 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.667947 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.667985 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-config\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.668030 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.668071 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr97l\" (UniqueName: \"kubernetes.io/projected/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-kube-api-access-kr97l\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.668111 4865 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: 
I1205 06:09:47.668587 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.668815 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.670061 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-config\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.670335 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.680725 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.681369 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.693070 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr97l\" (UniqueName: \"kubernetes.io/projected/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-kube-api-access-kr97l\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.693854 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.695107 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a7a744-bc5d-4bb1-88a2-d90afeb9fdad-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad\") " pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:47 crc kubenswrapper[4865]: I1205 06:09:47.831275 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 06:09:48 crc kubenswrapper[4865]: I1205 06:09:48.059349 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9a70babd-c8a6-442f-aa44-d013f3887c93","Type":"ContainerStarted","Data":"7fd48bc7b6fa346e150ca6cf28b6142c3f2e388c202fdcf8219743dfb037179a"} Dec 05 06:09:48 crc kubenswrapper[4865]: I1205 06:09:48.060691 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"05035a7d-0d83-46dd-a889-3db64fb647e8","Type":"ContainerStarted","Data":"0854616449d17758198b68193ac83be3904b86f119b48b708ab9cc19ac1f41e3"} Dec 05 06:09:48 crc kubenswrapper[4865]: I1205 06:09:48.972631 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 06:09:48 crc kubenswrapper[4865]: I1205 06:09:48.974079 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:48 crc kubenswrapper[4865]: I1205 06:09:48.975971 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 05 06:09:48 crc kubenswrapper[4865]: I1205 06:09:48.976104 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-qw8mf" Dec 05 06:09:48 crc kubenswrapper[4865]: I1205 06:09:48.976227 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 05 06:09:48 crc kubenswrapper[4865]: I1205 06:09:48.976334 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 05 06:09:48 crc kubenswrapper[4865]: I1205 06:09:48.990263 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.090143 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5400c67c-5f55-47eb-88dc-699ecf76bc95-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.090190 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5400c67c-5f55-47eb-88dc-699ecf76bc95-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.090238 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.090276 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5400c67c-5f55-47eb-88dc-699ecf76bc95-config\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.090476 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4xfx\" (UniqueName: 
\"kubernetes.io/projected/5400c67c-5f55-47eb-88dc-699ecf76bc95-kube-api-access-k4xfx\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.090558 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5400c67c-5f55-47eb-88dc-699ecf76bc95-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.090574 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5400c67c-5f55-47eb-88dc-699ecf76bc95-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.090591 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5400c67c-5f55-47eb-88dc-699ecf76bc95-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.192798 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5400c67c-5f55-47eb-88dc-699ecf76bc95-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.193434 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5400c67c-5f55-47eb-88dc-699ecf76bc95-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.193584 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.193765 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5400c67c-5f55-47eb-88dc-699ecf76bc95-config\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.194215 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4xfx\" (UniqueName: \"kubernetes.io/projected/5400c67c-5f55-47eb-88dc-699ecf76bc95-kube-api-access-k4xfx\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.194334 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5400c67c-5f55-47eb-88dc-699ecf76bc95-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc 
kubenswrapper[4865]: I1205 06:09:49.194440 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5400c67c-5f55-47eb-88dc-699ecf76bc95-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.194531 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5400c67c-5f55-47eb-88dc-699ecf76bc95-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.194358 4865 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.195004 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5400c67c-5f55-47eb-88dc-699ecf76bc95-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.196291 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5400c67c-5f55-47eb-88dc-699ecf76bc95-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.195105 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5400c67c-5f55-47eb-88dc-699ecf76bc95-config\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.199542 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5400c67c-5f55-47eb-88dc-699ecf76bc95-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.200369 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5400c67c-5f55-47eb-88dc-699ecf76bc95-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.201258 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5400c67c-5f55-47eb-88dc-699ecf76bc95-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.212410 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4xfx\" (UniqueName: \"kubernetes.io/projected/5400c67c-5f55-47eb-88dc-699ecf76bc95-kube-api-access-k4xfx\") pod \"ovsdbserver-sb-0\" (UID: 
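
The two mount sequences above (ovsdbserver-nb-0, then ovsdbserver-sb-0) follow the same reconciler pattern: operationExecutor.VerifyControllerAttachedVolume, then "MountVolume started", then MountVolume.MountDevice / MountVolume.SetUp succeeded per volume. A minimal sketch, assuming the journal text above has been saved to a plain-text file (the file name is hypothetical), that checks whether every volume that started mounting also reports SetUp success; entries that this capture wraps across physical lines simply fail to match and are skipped.

```python
import re
from collections import defaultdict

# Hypothetical capture of the journal text above, saved as plain text.
LOG_PATH = "kubelet-journal.log"

# \\? tolerates the escaped quotes (\") that appear inside quoted klog messages
# in this capture; wrapped entries simply fail to match and are skipped.
STARTED = re.compile(
    r'operationExecutor\.MountVolume started for volume \\?"(?P<vol>[^"\\]+)\\?"'
    r'.*?pod="(?P<pod>[^"]+)"'
)
SETUP_OK = re.compile(
    r'MountVolume\.SetUp succeeded for volume \\?"(?P<vol>[^"\\]+)\\?"'
    r'.*?pod="(?P<pod>[^"]+)"'
)

started, succeeded = defaultdict(set), defaultdict(set)
text = open(LOG_PATH, encoding="utf-8").read()
for m in STARTED.finditer(text):
    started[m["pod"]].add(m["vol"])
for m in SETUP_OK.finditer(text):
    succeeded[m["pod"]].add(m["vol"])

for pod in sorted(started):
    pending = started[pod] - succeeded[pod]
    note = "all set up" if not pending else "still pending: " + ", ".join(sorted(pending))
    print(f"{pod}: {len(succeeded[pod])}/{len(started[pod])} volumes ({note})")
```
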
\"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.222843 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"5400c67c-5f55-47eb-88dc-699ecf76bc95\") " pod="openstack/ovsdbserver-sb-0" Dec 05 06:09:49 crc kubenswrapper[4865]: I1205 06:09:49.291635 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 06:10:01 crc kubenswrapper[4865]: E1205 06:10:01.360305 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 05 06:10:01 crc kubenswrapper[4865]: E1205 06:10:01.361469 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2rfxh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(8a62a048-0ebe-4e5e-988a-4dde7746af74): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:10:01 crc kubenswrapper[4865]: E1205 06:10:01.363672 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" 
podUID="8a62a048-0ebe-4e5e-988a-4dde7746af74" Dec 05 06:10:02 crc kubenswrapper[4865]: E1205 06:10:02.043213 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 05 06:10:02 crc kubenswrapper[4865]: E1205 06:10:02.043394 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jwcd4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-pw9js_openstack(273190c6-096d-4098-b426-7fa1854b09f8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:10:02 crc kubenswrapper[4865]: E1205 06:10:02.044629 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-pw9js" podUID="273190c6-096d-4098-b426-7fa1854b09f8" Dec 05 06:10:02 crc kubenswrapper[4865]: E1205 06:10:02.054895 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 05 06:10:02 crc kubenswrapper[4865]: E1205 06:10:02.055091 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rlgw7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-g2cb9_openstack(18c4c2dd-31a3-4b3c-8401-31c37d89ac3d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:10:02 crc kubenswrapper[4865]: E1205 06:10:02.056641 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-g2cb9" podUID="18c4c2dd-31a3-4b3c-8401-31c37d89ac3d" Dec 05 06:10:02 crc kubenswrapper[4865]: E1205 06:10:02.062615 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 05 06:10:02 crc kubenswrapper[4865]: E1205 06:10:02.062765 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2k78f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-pvtgk_openstack(51f79fb1-29a1-479c-8574-0b2d7d549c3b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:10:02 crc kubenswrapper[4865]: E1205 06:10:02.063963 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-pvtgk" podUID="51f79fb1-29a1-479c-8574-0b2d7d549c3b" Dec 05 06:10:02 crc kubenswrapper[4865]: E1205 06:10:02.078862 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 05 06:10:02 crc kubenswrapper[4865]: E1205 06:10:02.079042 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zqj5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-9tvv7_openstack(a4572a2a-3255-485e-91ea-69fe52b1e3a6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:10:02 crc kubenswrapper[4865]: E1205 06:10:02.081203 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-9tvv7" podUID="a4572a2a-3255-485e-91ea-69fe52b1e3a6" Dec 05 06:10:02 crc kubenswrapper[4865]: E1205 06:10:02.202921 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="8a62a048-0ebe-4e5e-988a-4dde7746af74" Dec 05 06:10:02 crc kubenswrapper[4865]: E1205 06:10:02.203342 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-pw9js" podUID="273190c6-096d-4098-b426-7fa1854b09f8" Dec 05 06:10:02 crc kubenswrapper[4865]: E1205 06:10:02.206366 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-9tvv7" podUID="a4572a2a-3255-485e-91ea-69fe52b1e3a6" Dec 05 06:10:03 crc kubenswrapper[4865]: E1205 06:10:03.180428 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 05 06:10:03 crc kubenswrapper[4865]: E1205 06:10:03.180945 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jvw2t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:10:03 crc kubenswrapper[4865]: E1205 06:10:03.182091 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" Dec 05 06:10:03 crc kubenswrapper[4865]: E1205 06:10:03.224148 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" Dec 05 06:10:03 crc kubenswrapper[4865]: E1205 06:10:03.272491 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 05 06:10:03 crc kubenswrapper[4865]: E1205 06:10:03.272695 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d99nb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(ff4eacf2-62b6-48a0-9650-77e19a6db904): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:10:03 crc kubenswrapper[4865]: E1205 06:10:03.274029 4865 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="ff4eacf2-62b6-48a0-9650-77e19a6db904" Dec 05 06:10:03 crc kubenswrapper[4865]: I1205 06:10:03.321223 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g2cb9" Dec 05 06:10:03 crc kubenswrapper[4865]: I1205 06:10:03.337781 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pvtgk" Dec 05 06:10:03 crc kubenswrapper[4865]: I1205 06:10:03.374748 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlgw7\" (UniqueName: \"kubernetes.io/projected/18c4c2dd-31a3-4b3c-8401-31c37d89ac3d-kube-api-access-rlgw7\") pod \"18c4c2dd-31a3-4b3c-8401-31c37d89ac3d\" (UID: \"18c4c2dd-31a3-4b3c-8401-31c37d89ac3d\") " Dec 05 06:10:03 crc kubenswrapper[4865]: I1205 06:10:03.374937 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51f79fb1-29a1-479c-8574-0b2d7d549c3b-dns-svc\") pod \"51f79fb1-29a1-479c-8574-0b2d7d549c3b\" (UID: \"51f79fb1-29a1-479c-8574-0b2d7d549c3b\") " Dec 05 06:10:03 crc kubenswrapper[4865]: I1205 06:10:03.375047 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k78f\" (UniqueName: \"kubernetes.io/projected/51f79fb1-29a1-479c-8574-0b2d7d549c3b-kube-api-access-2k78f\") pod \"51f79fb1-29a1-479c-8574-0b2d7d549c3b\" (UID: \"51f79fb1-29a1-479c-8574-0b2d7d549c3b\") " Dec 05 06:10:03 crc kubenswrapper[4865]: I1205 06:10:03.375120 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51f79fb1-29a1-479c-8574-0b2d7d549c3b-config\") pod \"51f79fb1-29a1-479c-8574-0b2d7d549c3b\" (UID: \"51f79fb1-29a1-479c-8574-0b2d7d549c3b\") " Dec 05 06:10:03 crc kubenswrapper[4865]: I1205 06:10:03.375201 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18c4c2dd-31a3-4b3c-8401-31c37d89ac3d-config\") pod \"18c4c2dd-31a3-4b3c-8401-31c37d89ac3d\" (UID: \"18c4c2dd-31a3-4b3c-8401-31c37d89ac3d\") " Dec 05 06:10:03 crc kubenswrapper[4865]: I1205 06:10:03.376215 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18c4c2dd-31a3-4b3c-8401-31c37d89ac3d-config" (OuterVolumeSpecName: "config") pod "18c4c2dd-31a3-4b3c-8401-31c37d89ac3d" (UID: "18c4c2dd-31a3-4b3c-8401-31c37d89ac3d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:03 crc kubenswrapper[4865]: I1205 06:10:03.377238 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51f79fb1-29a1-479c-8574-0b2d7d549c3b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "51f79fb1-29a1-479c-8574-0b2d7d549c3b" (UID: "51f79fb1-29a1-479c-8574-0b2d7d549c3b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:03 crc kubenswrapper[4865]: I1205 06:10:03.379235 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51f79fb1-29a1-479c-8574-0b2d7d549c3b-config" (OuterVolumeSpecName: "config") pod "51f79fb1-29a1-479c-8574-0b2d7d549c3b" (UID: "51f79fb1-29a1-479c-8574-0b2d7d549c3b"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:03 crc kubenswrapper[4865]: I1205 06:10:03.384550 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51f79fb1-29a1-479c-8574-0b2d7d549c3b-kube-api-access-2k78f" (OuterVolumeSpecName: "kube-api-access-2k78f") pod "51f79fb1-29a1-479c-8574-0b2d7d549c3b" (UID: "51f79fb1-29a1-479c-8574-0b2d7d549c3b"). InnerVolumeSpecName "kube-api-access-2k78f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:10:03 crc kubenswrapper[4865]: I1205 06:10:03.398597 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18c4c2dd-31a3-4b3c-8401-31c37d89ac3d-kube-api-access-rlgw7" (OuterVolumeSpecName: "kube-api-access-rlgw7") pod "18c4c2dd-31a3-4b3c-8401-31c37d89ac3d" (UID: "18c4c2dd-31a3-4b3c-8401-31c37d89ac3d"). InnerVolumeSpecName "kube-api-access-rlgw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:10:03 crc kubenswrapper[4865]: I1205 06:10:03.477033 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k78f\" (UniqueName: \"kubernetes.io/projected/51f79fb1-29a1-479c-8574-0b2d7d549c3b-kube-api-access-2k78f\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:03 crc kubenswrapper[4865]: I1205 06:10:03.477071 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51f79fb1-29a1-479c-8574-0b2d7d549c3b-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:03 crc kubenswrapper[4865]: I1205 06:10:03.477084 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18c4c2dd-31a3-4b3c-8401-31c37d89ac3d-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:03 crc kubenswrapper[4865]: I1205 06:10:03.477097 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlgw7\" (UniqueName: \"kubernetes.io/projected/18c4c2dd-31a3-4b3c-8401-31c37d89ac3d-kube-api-access-rlgw7\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:03 crc kubenswrapper[4865]: I1205 06:10:03.477109 4865 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51f79fb1-29a1-479c-8574-0b2d7d549c3b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:04 crc kubenswrapper[4865]: I1205 06:10:04.019034 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-jbvmz"] Dec 05 06:10:04 crc kubenswrapper[4865]: I1205 06:10:04.223264 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jbvmz" event={"ID":"b92328b7-456b-45ce-8416-765f465ac793","Type":"ContainerStarted","Data":"cb47008fa196561d090d79a596f55835847db9e74f4934be5b479e7fdd550f70"} Dec 05 06:10:04 crc kubenswrapper[4865]: I1205 06:10:04.240253 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-g2cb9" event={"ID":"18c4c2dd-31a3-4b3c-8401-31c37d89ac3d","Type":"ContainerDied","Data":"cdbced58f2381cb8c3cd20251b7f8e5b9e961abf56045d6ffad7ead808dc51f8"} Dec 05 06:10:04 crc kubenswrapper[4865]: I1205 06:10:04.240373 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g2cb9" Dec 05 06:10:04 crc kubenswrapper[4865]: I1205 06:10:04.246399 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-pvtgk" event={"ID":"51f79fb1-29a1-479c-8574-0b2d7d549c3b","Type":"ContainerDied","Data":"a461ffb481aeb35da1a66cbfd9551ed494dcca48b9ed992d83ed2ac8e7ccbfc3"} Dec 05 06:10:04 crc kubenswrapper[4865]: I1205 06:10:04.246400 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pvtgk" Dec 05 06:10:04 crc kubenswrapper[4865]: E1205 06:10:04.265022 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="ff4eacf2-62b6-48a0-9650-77e19a6db904" Dec 05 06:10:04 crc kubenswrapper[4865]: I1205 06:10:04.444334 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g2cb9"] Dec 05 06:10:04 crc kubenswrapper[4865]: I1205 06:10:04.451889 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g2cb9"] Dec 05 06:10:04 crc kubenswrapper[4865]: I1205 06:10:04.467885 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pvtgk"] Dec 05 06:10:04 crc kubenswrapper[4865]: I1205 06:10:04.473811 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pvtgk"] Dec 05 06:10:04 crc kubenswrapper[4865]: I1205 06:10:04.675281 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 06:10:04 crc kubenswrapper[4865]: I1205 06:10:04.755282 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 06:10:04 crc kubenswrapper[4865]: I1205 06:10:04.812224 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-56dth"] Dec 05 06:10:04 crc kubenswrapper[4865]: I1205 06:10:04.902740 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 06:10:04 crc kubenswrapper[4865]: W1205 06:10:04.906418 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43a7a744_bc5d_4bb1_88a2_d90afeb9fdad.slice/crio-4f3b747bda86b35fde3a934ac90c17220a74de36a8560f86350af17d2691d343 WatchSource:0}: Error finding container 4f3b747bda86b35fde3a934ac90c17220a74de36a8560f86350af17d2691d343: Status 404 returned error can't find the container with id 4f3b747bda86b35fde3a934ac90c17220a74de36a8560f86350af17d2691d343 Dec 05 06:10:05 crc kubenswrapper[4865]: I1205 06:10:05.016107 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18c4c2dd-31a3-4b3c-8401-31c37d89ac3d" path="/var/lib/kubelet/pods/18c4c2dd-31a3-4b3c-8401-31c37d89ac3d/volumes" Dec 05 06:10:05 crc kubenswrapper[4865]: I1205 06:10:05.016489 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51f79fb1-29a1-479c-8574-0b2d7d549c3b" path="/var/lib/kubelet/pods/51f79fb1-29a1-479c-8574-0b2d7d549c3b/volumes" Dec 05 06:10:05 crc kubenswrapper[4865]: I1205 06:10:05.258967 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56dth" 
event={"ID":"30eebd2b-aed6-4866-bec4-da326d89821c","Type":"ContainerStarted","Data":"0ed647968b497ec8ff25f09f8a96bebeea9c2d2b406a3ca179383f1236bee3bd"} Dec 05 06:10:05 crc kubenswrapper[4865]: I1205 06:10:05.260535 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5400c67c-5f55-47eb-88dc-699ecf76bc95","Type":"ContainerStarted","Data":"2820a97253dd03605f3459ebee99e7b8b87c413c940fd4dab87d12aeb3ea42eb"} Dec 05 06:10:05 crc kubenswrapper[4865]: I1205 06:10:05.262043 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"05035a7d-0d83-46dd-a889-3db64fb647e8","Type":"ContainerStarted","Data":"dd6dbb7ac8f981c58450d6a846ab253b1e4148bf83766950f334fb767f8d3921"} Dec 05 06:10:05 crc kubenswrapper[4865]: I1205 06:10:05.262164 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 05 06:10:05 crc kubenswrapper[4865]: I1205 06:10:05.263358 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1b43a6d5-6af1-413b-bfec-2607a76cc294","Type":"ContainerStarted","Data":"08f986d3d2f8c3b4cc69f7db04e2449ed44b4b1a0c0b93ce522cd40e1d1dc098"} Dec 05 06:10:05 crc kubenswrapper[4865]: I1205 06:10:05.266896 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9a70babd-c8a6-442f-aa44-d013f3887c93","Type":"ContainerStarted","Data":"ee0a99043ac5b968d972fe31456826af58ca942462aa6291afded89110a6df39"} Dec 05 06:10:05 crc kubenswrapper[4865]: I1205 06:10:05.269745 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad","Type":"ContainerStarted","Data":"4f3b747bda86b35fde3a934ac90c17220a74de36a8560f86350af17d2691d343"} Dec 05 06:10:05 crc kubenswrapper[4865]: I1205 06:10:05.289401 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=9.784726303 podStartE2EDuration="26.28938536s" podCreationTimestamp="2025-12-05 06:09:39 +0000 UTC" firstStartedPulling="2025-12-05 06:09:47.647709277 +0000 UTC m=+1006.927720499" lastFinishedPulling="2025-12-05 06:10:04.152368334 +0000 UTC m=+1023.432379556" observedRunningTime="2025-12-05 06:10:05.281895976 +0000 UTC m=+1024.561907208" watchObservedRunningTime="2025-12-05 06:10:05.28938536 +0000 UTC m=+1024.569396582" Dec 05 06:10:08 crc kubenswrapper[4865]: I1205 06:10:08.307099 4865 generic.go:334] "Generic (PLEG): container finished" podID="9a70babd-c8a6-442f-aa44-d013f3887c93" containerID="ee0a99043ac5b968d972fe31456826af58ca942462aa6291afded89110a6df39" exitCode=0 Dec 05 06:10:08 crc kubenswrapper[4865]: I1205 06:10:08.307201 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9a70babd-c8a6-442f-aa44-d013f3887c93","Type":"ContainerDied","Data":"ee0a99043ac5b968d972fe31456826af58ca942462aa6291afded89110a6df39"} Dec 05 06:10:10 crc kubenswrapper[4865]: I1205 06:10:10.247659 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 05 06:10:10 crc kubenswrapper[4865]: I1205 06:10:10.328797 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jbvmz" event={"ID":"b92328b7-456b-45ce-8416-765f465ac793","Type":"ContainerStarted","Data":"a37a7511630500a175ea2afeb1e9c17e46ec7653c3e80a48522d7388dfbbf3ee"} Dec 05 06:10:10 crc kubenswrapper[4865]: I1205 06:10:10.332777 4865 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56dth" event={"ID":"30eebd2b-aed6-4866-bec4-da326d89821c","Type":"ContainerStarted","Data":"89d5a5b9983525137debc530b267e50d578cbf7e5898359cb10fba0e03cf68a2"} Dec 05 06:10:10 crc kubenswrapper[4865]: I1205 06:10:10.333691 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-56dth" Dec 05 06:10:10 crc kubenswrapper[4865]: I1205 06:10:10.339494 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5400c67c-5f55-47eb-88dc-699ecf76bc95","Type":"ContainerStarted","Data":"b9c26b3785d8f263c132304a37f602024608311d8eac7a9d7ddd82f803c72d98"} Dec 05 06:10:10 crc kubenswrapper[4865]: I1205 06:10:10.343283 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9a70babd-c8a6-442f-aa44-d013f3887c93","Type":"ContainerStarted","Data":"adcefdfb21e535e01f24a144f034226e6f06187ff3b36fb242e39db1f0255e98"} Dec 05 06:10:10 crc kubenswrapper[4865]: I1205 06:10:10.390519 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-56dth" podStartSLOduration=22.164601471 podStartE2EDuration="25.390505353s" podCreationTimestamp="2025-12-05 06:09:45 +0000 UTC" firstStartedPulling="2025-12-05 06:10:04.8261127 +0000 UTC m=+1024.106123922" lastFinishedPulling="2025-12-05 06:10:08.052016582 +0000 UTC m=+1027.332027804" observedRunningTime="2025-12-05 06:10:10.388865126 +0000 UTC m=+1029.668876368" watchObservedRunningTime="2025-12-05 06:10:10.390505353 +0000 UTC m=+1029.670516575" Dec 05 06:10:10 crc kubenswrapper[4865]: I1205 06:10:10.419125 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=15.918206239 podStartE2EDuration="32.419108439s" podCreationTimestamp="2025-12-05 06:09:38 +0000 UTC" firstStartedPulling="2025-12-05 06:09:47.642184099 +0000 UTC m=+1006.922195321" lastFinishedPulling="2025-12-05 06:10:04.143086299 +0000 UTC m=+1023.423097521" observedRunningTime="2025-12-05 06:10:10.416527996 +0000 UTC m=+1029.696539248" watchObservedRunningTime="2025-12-05 06:10:10.419108439 +0000 UTC m=+1029.699119681" Dec 05 06:10:12 crc kubenswrapper[4865]: I1205 06:10:12.359497 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1b43a6d5-6af1-413b-bfec-2607a76cc294","Type":"ContainerStarted","Data":"e0786d153574d957a17e2fcdce796e99b55e3d061c8fa99019a1541f6b65c006"} Dec 05 06:10:12 crc kubenswrapper[4865]: I1205 06:10:12.360328 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 06:10:12 crc kubenswrapper[4865]: I1205 06:10:12.361328 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad","Type":"ContainerStarted","Data":"64a962b6e6080264f0a01ce5b0e652a86b494f2d6dd0ad55d8a28798b77efa59"} Dec 05 06:10:12 crc kubenswrapper[4865]: I1205 06:10:12.364574 4865 generic.go:334] "Generic (PLEG): container finished" podID="b92328b7-456b-45ce-8416-765f465ac793" containerID="a37a7511630500a175ea2afeb1e9c17e46ec7653c3e80a48522d7388dfbbf3ee" exitCode=0 Dec 05 06:10:12 crc kubenswrapper[4865]: I1205 06:10:12.364900 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jbvmz" 
event={"ID":"b92328b7-456b-45ce-8416-765f465ac793","Type":"ContainerDied","Data":"a37a7511630500a175ea2afeb1e9c17e46ec7653c3e80a48522d7388dfbbf3ee"} Dec 05 06:10:12 crc kubenswrapper[4865]: I1205 06:10:12.382556 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=23.155519759 podStartE2EDuration="30.382534936s" podCreationTimestamp="2025-12-05 06:09:42 +0000 UTC" firstStartedPulling="2025-12-05 06:10:04.696083949 +0000 UTC m=+1023.976095171" lastFinishedPulling="2025-12-05 06:10:11.923099126 +0000 UTC m=+1031.203110348" observedRunningTime="2025-12-05 06:10:12.37462716 +0000 UTC m=+1031.654638382" watchObservedRunningTime="2025-12-05 06:10:12.382534936 +0000 UTC m=+1031.662546158" Dec 05 06:10:12 crc kubenswrapper[4865]: I1205 06:10:12.920579 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9tvv7"] Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.041055 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-x8c88"] Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.042527 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-x8c88"] Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.042631 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.152956 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3df550-e8c3-4437-8299-794ff6295eed-config\") pod \"dnsmasq-dns-7cb5889db5-x8c88\" (UID: \"3c3df550-e8c3-4437-8299-794ff6295eed\") " pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.153005 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vldh\" (UniqueName: \"kubernetes.io/projected/3c3df550-e8c3-4437-8299-794ff6295eed-kube-api-access-6vldh\") pod \"dnsmasq-dns-7cb5889db5-x8c88\" (UID: \"3c3df550-e8c3-4437-8299-794ff6295eed\") " pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.153027 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c3df550-e8c3-4437-8299-794ff6295eed-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-x8c88\" (UID: \"3c3df550-e8c3-4437-8299-794ff6295eed\") " pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.255247 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3df550-e8c3-4437-8299-794ff6295eed-config\") pod \"dnsmasq-dns-7cb5889db5-x8c88\" (UID: \"3c3df550-e8c3-4437-8299-794ff6295eed\") " pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.255307 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vldh\" (UniqueName: \"kubernetes.io/projected/3c3df550-e8c3-4437-8299-794ff6295eed-kube-api-access-6vldh\") pod \"dnsmasq-dns-7cb5889db5-x8c88\" (UID: \"3c3df550-e8c3-4437-8299-794ff6295eed\") " pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.255330 4865 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c3df550-e8c3-4437-8299-794ff6295eed-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-x8c88\" (UID: \"3c3df550-e8c3-4437-8299-794ff6295eed\") " pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.256156 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c3df550-e8c3-4437-8299-794ff6295eed-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-x8c88\" (UID: \"3c3df550-e8c3-4437-8299-794ff6295eed\") " pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.256965 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3df550-e8c3-4437-8299-794ff6295eed-config\") pod \"dnsmasq-dns-7cb5889db5-x8c88\" (UID: \"3c3df550-e8c3-4437-8299-794ff6295eed\") " pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.309759 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vldh\" (UniqueName: \"kubernetes.io/projected/3c3df550-e8c3-4437-8299-794ff6295eed-kube-api-access-6vldh\") pod \"dnsmasq-dns-7cb5889db5-x8c88\" (UID: \"3c3df550-e8c3-4437-8299-794ff6295eed\") " pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.400486 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jbvmz" event={"ID":"b92328b7-456b-45ce-8416-765f465ac793","Type":"ContainerStarted","Data":"c2d2beb6883e13a08445a9aa9a8779c0144b68c2e3635b253794ba157b562ef1"} Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.406058 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.607495 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9tvv7" Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.661590 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4572a2a-3255-485e-91ea-69fe52b1e3a6-dns-svc\") pod \"a4572a2a-3255-485e-91ea-69fe52b1e3a6\" (UID: \"a4572a2a-3255-485e-91ea-69fe52b1e3a6\") " Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.661673 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqj5r\" (UniqueName: \"kubernetes.io/projected/a4572a2a-3255-485e-91ea-69fe52b1e3a6-kube-api-access-zqj5r\") pod \"a4572a2a-3255-485e-91ea-69fe52b1e3a6\" (UID: \"a4572a2a-3255-485e-91ea-69fe52b1e3a6\") " Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.661890 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4572a2a-3255-485e-91ea-69fe52b1e3a6-config\") pod \"a4572a2a-3255-485e-91ea-69fe52b1e3a6\" (UID: \"a4572a2a-3255-485e-91ea-69fe52b1e3a6\") " Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.662793 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4572a2a-3255-485e-91ea-69fe52b1e3a6-config" (OuterVolumeSpecName: "config") pod "a4572a2a-3255-485e-91ea-69fe52b1e3a6" (UID: "a4572a2a-3255-485e-91ea-69fe52b1e3a6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.663271 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4572a2a-3255-485e-91ea-69fe52b1e3a6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a4572a2a-3255-485e-91ea-69fe52b1e3a6" (UID: "a4572a2a-3255-485e-91ea-69fe52b1e3a6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.686549 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4572a2a-3255-485e-91ea-69fe52b1e3a6-kube-api-access-zqj5r" (OuterVolumeSpecName: "kube-api-access-zqj5r") pod "a4572a2a-3255-485e-91ea-69fe52b1e3a6" (UID: "a4572a2a-3255-485e-91ea-69fe52b1e3a6"). InnerVolumeSpecName "kube-api-access-zqj5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.763685 4865 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4572a2a-3255-485e-91ea-69fe52b1e3a6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.763720 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqj5r\" (UniqueName: \"kubernetes.io/projected/a4572a2a-3255-485e-91ea-69fe52b1e3a6-kube-api-access-zqj5r\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:13 crc kubenswrapper[4865]: I1205 06:10:13.763735 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4572a2a-3255-485e-91ea-69fe52b1e3a6-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.028854 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-x8c88"] Dec 05 06:10:14 crc kubenswrapper[4865]: W1205 06:10:14.062055 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c3df550_e8c3_4437_8299_794ff6295eed.slice/crio-a1cc62fbbc07037ce370be68fff10c21cb85b56d1c6e66ff9b958b4794184c25 WatchSource:0}: Error finding container a1cc62fbbc07037ce370be68fff10c21cb85b56d1c6e66ff9b958b4794184c25: Status 404 returned error can't find the container with id a1cc62fbbc07037ce370be68fff10c21cb85b56d1c6e66ff9b958b4794184c25 Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.187761 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.203235 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.203417 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.206429 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.206637 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.210317 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2bnpd" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.210529 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.279131 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-lock\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.279283 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.279382 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7msjj\" (UniqueName: \"kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-kube-api-access-7msjj\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.279426 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.279720 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-cache\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.381681 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-lock\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.381791 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.381969 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7msjj\" (UniqueName: \"kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-kube-api-access-7msjj\") pod \"swift-storage-0\" 
(UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.382008 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.382147 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-cache\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.382404 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-lock\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.382566 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-cache\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:14 crc kubenswrapper[4865]: E1205 06:10:14.382598 4865 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 06:10:14 crc kubenswrapper[4865]: E1205 06:10:14.382613 4865 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 06:10:14 crc kubenswrapper[4865]: E1205 06:10:14.382655 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift podName:30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f nodeName:}" failed. No retries permitted until 2025-12-05 06:10:14.882639351 +0000 UTC m=+1034.162650573 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift") pod "swift-storage-0" (UID: "30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f") : configmap "swift-ring-files" not found Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.382943 4865 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/swift-storage-0" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.403786 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7msjj\" (UniqueName: \"kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-kube-api-access-7msjj\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.415606 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" event={"ID":"3c3df550-e8c3-4437-8299-794ff6295eed","Type":"ContainerStarted","Data":"a1cc62fbbc07037ce370be68fff10c21cb85b56d1c6e66ff9b958b4794184c25"} Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.419134 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jbvmz" event={"ID":"b92328b7-456b-45ce-8416-765f465ac793","Type":"ContainerStarted","Data":"2907198850ab753d8a0fe6a11ce56ebe6bdbe2e0e26e8eb0b25cb4fd358dbc1a"} Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.420729 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.420847 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.422131 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.447816 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9tvv7" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.452268 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-jbvmz" podStartSLOduration=25.490805306 podStartE2EDuration="29.452241677s" podCreationTimestamp="2025-12-05 06:09:45 +0000 UTC" firstStartedPulling="2025-12-05 06:10:04.090930311 +0000 UTC m=+1023.370941533" lastFinishedPulling="2025-12-05 06:10:08.052366682 +0000 UTC m=+1027.332377904" observedRunningTime="2025-12-05 06:10:14.439082481 +0000 UTC m=+1033.719093703" watchObservedRunningTime="2025-12-05 06:10:14.452241677 +0000 UTC m=+1033.732252889" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.453437 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9tvv7" event={"ID":"a4572a2a-3255-485e-91ea-69fe52b1e3a6","Type":"ContainerDied","Data":"878d29538bb298d3c48b6e4867c744f5e3bcebc0add1fea5b4ae94b94ff79d1a"} Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.611780 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9tvv7"] Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.617446 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9tvv7"] Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.821458 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-6x5z9"] Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.823413 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.830229 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.830441 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.831121 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.856552 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-6x5z9"] Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.879939 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-6x5z9"] Dec 05 06:10:14 crc kubenswrapper[4865]: E1205 06:10:14.881699 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-kxngd ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-6x5z9" podUID="b2036e55-5f77-4d9b-bc89-563e450c2a9a" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.887889 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-gx5lg"] Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.890190 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.891394 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2036e55-5f77-4d9b-bc89-563e450c2a9a-combined-ca-bundle\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.891441 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b2036e55-5f77-4d9b-bc89-563e450c2a9a-ring-data-devices\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.891467 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2036e55-5f77-4d9b-bc89-563e450c2a9a-scripts\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.891502 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b2036e55-5f77-4d9b-bc89-563e450c2a9a-etc-swift\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.891518 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b2036e55-5f77-4d9b-bc89-563e450c2a9a-dispersionconf\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.891547 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.891568 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxngd\" (UniqueName: \"kubernetes.io/projected/b2036e55-5f77-4d9b-bc89-563e450c2a9a-kube-api-access-kxngd\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.891611 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b2036e55-5f77-4d9b-bc89-563e450c2a9a-swiftconf\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:14 crc kubenswrapper[4865]: E1205 06:10:14.891857 4865 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 06:10:14 crc kubenswrapper[4865]: E1205 06:10:14.891873 4865 projected.go:194] Error preparing data for projected volume etc-swift for 
pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 06:10:14 crc kubenswrapper[4865]: E1205 06:10:14.891920 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift podName:30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f nodeName:}" failed. No retries permitted until 2025-12-05 06:10:15.891905952 +0000 UTC m=+1035.171917164 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift") pod "swift-storage-0" (UID: "30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f") : configmap "swift-ring-files" not found Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.932346 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-gx5lg"] Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.992556 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b2036e55-5f77-4d9b-bc89-563e450c2a9a-etc-swift\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.992601 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b2036e55-5f77-4d9b-bc89-563e450c2a9a-dispersionconf\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.992624 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z7g2\" (UniqueName: \"kubernetes.io/projected/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-kube-api-access-7z7g2\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.992659 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxngd\" (UniqueName: \"kubernetes.io/projected/b2036e55-5f77-4d9b-bc89-563e450c2a9a-kube-api-access-kxngd\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.993088 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b2036e55-5f77-4d9b-bc89-563e450c2a9a-etc-swift\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.994542 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-ring-data-devices\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.994661 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b2036e55-5f77-4d9b-bc89-563e450c2a9a-swiftconf\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " 
pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.994692 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-etc-swift\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.995213 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2036e55-5f77-4d9b-bc89-563e450c2a9a-combined-ca-bundle\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.995552 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-swiftconf\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.995608 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-dispersionconf\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.995644 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b2036e55-5f77-4d9b-bc89-563e450c2a9a-ring-data-devices\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.995687 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2036e55-5f77-4d9b-bc89-563e450c2a9a-scripts\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.995759 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-combined-ca-bundle\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.995793 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-scripts\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.996517 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b2036e55-5f77-4d9b-bc89-563e450c2a9a-ring-data-devices\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " 
pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:14 crc kubenswrapper[4865]: I1205 06:10:14.996614 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2036e55-5f77-4d9b-bc89-563e450c2a9a-scripts\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.001566 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b2036e55-5f77-4d9b-bc89-563e450c2a9a-dispersionconf\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.004342 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2036e55-5f77-4d9b-bc89-563e450c2a9a-combined-ca-bundle\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.004618 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b2036e55-5f77-4d9b-bc89-563e450c2a9a-swiftconf\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.010529 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxngd\" (UniqueName: \"kubernetes.io/projected/b2036e55-5f77-4d9b-bc89-563e450c2a9a-kube-api-access-kxngd\") pod \"swift-ring-rebalance-6x5z9\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.048650 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4572a2a-3255-485e-91ea-69fe52b1e3a6" path="/var/lib/kubelet/pods/a4572a2a-3255-485e-91ea-69fe52b1e3a6/volumes" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.098836 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-combined-ca-bundle\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.098879 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-scripts\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.098920 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z7g2\" (UniqueName: \"kubernetes.io/projected/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-kube-api-access-7z7g2\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.098951 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-ring-data-devices\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.099042 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-etc-swift\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.099122 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-swiftconf\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.099160 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-dispersionconf\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.101925 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-etc-swift\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.103402 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-scripts\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.103961 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-ring-data-devices\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.110466 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-swiftconf\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.118278 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-dispersionconf\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.120098 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-combined-ca-bundle\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " 
pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.120343 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z7g2\" (UniqueName: \"kubernetes.io/projected/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-kube-api-access-7z7g2\") pod \"swift-ring-rebalance-gx5lg\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.240282 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.456994 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.476977 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.607196 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2036e55-5f77-4d9b-bc89-563e450c2a9a-combined-ca-bundle\") pod \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.607315 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxngd\" (UniqueName: \"kubernetes.io/projected/b2036e55-5f77-4d9b-bc89-563e450c2a9a-kube-api-access-kxngd\") pod \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.607371 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b2036e55-5f77-4d9b-bc89-563e450c2a9a-etc-swift\") pod \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.607400 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b2036e55-5f77-4d9b-bc89-563e450c2a9a-dispersionconf\") pod \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.607444 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2036e55-5f77-4d9b-bc89-563e450c2a9a-scripts\") pod \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.607475 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b2036e55-5f77-4d9b-bc89-563e450c2a9a-ring-data-devices\") pod \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.607500 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b2036e55-5f77-4d9b-bc89-563e450c2a9a-swiftconf\") pod \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\" (UID: \"b2036e55-5f77-4d9b-bc89-563e450c2a9a\") " Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.608458 4865 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2036e55-5f77-4d9b-bc89-563e450c2a9a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b2036e55-5f77-4d9b-bc89-563e450c2a9a" (UID: "b2036e55-5f77-4d9b-bc89-563e450c2a9a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.611032 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2036e55-5f77-4d9b-bc89-563e450c2a9a-scripts" (OuterVolumeSpecName: "scripts") pod "b2036e55-5f77-4d9b-bc89-563e450c2a9a" (UID: "b2036e55-5f77-4d9b-bc89-563e450c2a9a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.611143 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2036e55-5f77-4d9b-bc89-563e450c2a9a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b2036e55-5f77-4d9b-bc89-563e450c2a9a" (UID: "b2036e55-5f77-4d9b-bc89-563e450c2a9a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.611317 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2036e55-5f77-4d9b-bc89-563e450c2a9a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b2036e55-5f77-4d9b-bc89-563e450c2a9a" (UID: "b2036e55-5f77-4d9b-bc89-563e450c2a9a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.615963 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2036e55-5f77-4d9b-bc89-563e450c2a9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2036e55-5f77-4d9b-bc89-563e450c2a9a" (UID: "b2036e55-5f77-4d9b-bc89-563e450c2a9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.620130 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2036e55-5f77-4d9b-bc89-563e450c2a9a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b2036e55-5f77-4d9b-bc89-563e450c2a9a" (UID: "b2036e55-5f77-4d9b-bc89-563e450c2a9a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.633218 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2036e55-5f77-4d9b-bc89-563e450c2a9a-kube-api-access-kxngd" (OuterVolumeSpecName: "kube-api-access-kxngd") pod "b2036e55-5f77-4d9b-bc89-563e450c2a9a" (UID: "b2036e55-5f77-4d9b-bc89-563e450c2a9a"). InnerVolumeSpecName "kube-api-access-kxngd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.722414 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxngd\" (UniqueName: \"kubernetes.io/projected/b2036e55-5f77-4d9b-bc89-563e450c2a9a-kube-api-access-kxngd\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.722463 4865 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b2036e55-5f77-4d9b-bc89-563e450c2a9a-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.722478 4865 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b2036e55-5f77-4d9b-bc89-563e450c2a9a-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.722489 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2036e55-5f77-4d9b-bc89-563e450c2a9a-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.722500 4865 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b2036e55-5f77-4d9b-bc89-563e450c2a9a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.722512 4865 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b2036e55-5f77-4d9b-bc89-563e450c2a9a-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.722523 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2036e55-5f77-4d9b-bc89-563e450c2a9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:15 crc kubenswrapper[4865]: I1205 06:10:15.930593 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:15 crc kubenswrapper[4865]: E1205 06:10:15.931102 4865 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 06:10:15 crc kubenswrapper[4865]: E1205 06:10:15.931116 4865 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 06:10:15 crc kubenswrapper[4865]: E1205 06:10:15.931159 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift podName:30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f nodeName:}" failed. No retries permitted until 2025-12-05 06:10:17.931143157 +0000 UTC m=+1037.211154379 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift") pod "swift-storage-0" (UID: "30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f") : configmap "swift-ring-files" not found Dec 05 06:10:16 crc kubenswrapper[4865]: I1205 06:10:16.136854 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-gx5lg"] Dec 05 06:10:16 crc kubenswrapper[4865]: I1205 06:10:16.465791 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gx5lg" event={"ID":"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea","Type":"ContainerStarted","Data":"6cdd78b0b2433e9ffdc7ca5f5e04405cdf7e3f6e798f02aa03807ea34c5d384d"} Dec 05 06:10:16 crc kubenswrapper[4865]: I1205 06:10:16.465839 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-6x5z9" Dec 05 06:10:16 crc kubenswrapper[4865]: I1205 06:10:16.516324 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-6x5z9"] Dec 05 06:10:16 crc kubenswrapper[4865]: I1205 06:10:16.526491 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-6x5z9"] Dec 05 06:10:17 crc kubenswrapper[4865]: I1205 06:10:17.016389 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2036e55-5f77-4d9b-bc89-563e450c2a9a" path="/var/lib/kubelet/pods/b2036e55-5f77-4d9b-bc89-563e450c2a9a/volumes" Dec 05 06:10:17 crc kubenswrapper[4865]: E1205 06:10:17.567663 4865 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.147:46530->38.102.83.147:33339: write tcp 38.102.83.147:46530->38.102.83.147:33339: write: broken pipe Dec 05 06:10:17 crc kubenswrapper[4865]: I1205 06:10:17.986614 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:17 crc kubenswrapper[4865]: E1205 06:10:17.987007 4865 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 06:10:17 crc kubenswrapper[4865]: E1205 06:10:17.987035 4865 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 06:10:17 crc kubenswrapper[4865]: E1205 06:10:17.987117 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift podName:30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f nodeName:}" failed. No retries permitted until 2025-12-05 06:10:21.987090495 +0000 UTC m=+1041.267101737 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift") pod "swift-storage-0" (UID: "30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f") : configmap "swift-ring-files" not found Dec 05 06:10:19 crc kubenswrapper[4865]: I1205 06:10:19.502005 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8a62a048-0ebe-4e5e-988a-4dde7746af74","Type":"ContainerStarted","Data":"7e0be6fd60b0b6baf392b35658c5b42b2081be94921a74a76dc733fb0f8485b2"} Dec 05 06:10:19 crc kubenswrapper[4865]: I1205 06:10:19.871955 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 05 06:10:19 crc kubenswrapper[4865]: I1205 06:10:19.872296 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 05 06:10:19 crc kubenswrapper[4865]: I1205 06:10:19.976035 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 05 06:10:20 crc kubenswrapper[4865]: I1205 06:10:20.597488 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 05 06:10:22 crc kubenswrapper[4865]: I1205 06:10:22.063238 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:22 crc kubenswrapper[4865]: E1205 06:10:22.063515 4865 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 06:10:22 crc kubenswrapper[4865]: E1205 06:10:22.063543 4865 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 06:10:22 crc kubenswrapper[4865]: E1205 06:10:22.063599 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift podName:30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f nodeName:}" failed. No retries permitted until 2025-12-05 06:10:30.063581277 +0000 UTC m=+1049.343592499 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift") pod "swift-storage-0" (UID: "30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f") : configmap "swift-ring-files" not found Dec 05 06:10:22 crc kubenswrapper[4865]: I1205 06:10:22.668753 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 06:10:26 crc kubenswrapper[4865]: I1205 06:10:26.595626 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" event={"ID":"3c3df550-e8c3-4437-8299-794ff6295eed","Type":"ContainerStarted","Data":"f6312a57db2b894b81b729f5afa5acae2c2b0adea00b6b315021ed13cf2e86f6"} Dec 05 06:10:26 crc kubenswrapper[4865]: I1205 06:10:26.597954 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"43a7a744-bc5d-4bb1-88a2-d90afeb9fdad","Type":"ContainerStarted","Data":"f6153c8038c24367726adb7996dc09a1fc72f92d8c114188d75d7a534abcdd14"} Dec 05 06:10:26 crc kubenswrapper[4865]: I1205 06:10:26.600249 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pw9js" event={"ID":"273190c6-096d-4098-b426-7fa1854b09f8","Type":"ContainerStarted","Data":"9641c986667bddf044053c2415397ccd95879e5899fb4ca109cf1c42a6f072d8"} Dec 05 06:10:26 crc kubenswrapper[4865]: I1205 06:10:26.603809 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5400c67c-5f55-47eb-88dc-699ecf76bc95","Type":"ContainerStarted","Data":"594d5e175a38fd3e402ef47ff5b76c34f00afeb7f8d3686be35bc69c73bf2ba6"} Dec 05 06:10:27 crc kubenswrapper[4865]: I1205 06:10:27.613238 4865 generic.go:334] "Generic (PLEG): container finished" podID="3c3df550-e8c3-4437-8299-794ff6295eed" containerID="f6312a57db2b894b81b729f5afa5acae2c2b0adea00b6b315021ed13cf2e86f6" exitCode=0 Dec 05 06:10:27 crc kubenswrapper[4865]: I1205 06:10:27.613312 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" event={"ID":"3c3df550-e8c3-4437-8299-794ff6295eed","Type":"ContainerDied","Data":"f6312a57db2b894b81b729f5afa5acae2c2b0adea00b6b315021ed13cf2e86f6"} Dec 05 06:10:27 crc kubenswrapper[4865]: I1205 06:10:27.618245 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3","Type":"ContainerStarted","Data":"49d67739f31cafcc87fc6c330cb79fbdef086b770232befb583085154eba0839"} Dec 05 06:10:27 crc kubenswrapper[4865]: I1205 06:10:27.621160 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ff4eacf2-62b6-48a0-9650-77e19a6db904","Type":"ContainerStarted","Data":"c7f574b73f3827d92d357368f5ca3d0ed0908143aa99fad5d4fc36c55180e6ef"} Dec 05 06:10:27 crc kubenswrapper[4865]: I1205 06:10:27.626090 4865 generic.go:334] "Generic (PLEG): container finished" podID="273190c6-096d-4098-b426-7fa1854b09f8" containerID="9641c986667bddf044053c2415397ccd95879e5899fb4ca109cf1c42a6f072d8" exitCode=0 Dec 05 06:10:27 crc kubenswrapper[4865]: I1205 06:10:27.626188 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pw9js" event={"ID":"273190c6-096d-4098-b426-7fa1854b09f8","Type":"ContainerDied","Data":"9641c986667bddf044053c2415397ccd95879e5899fb4ca109cf1c42a6f072d8"} Dec 05 06:10:27 crc kubenswrapper[4865]: I1205 06:10:27.713496 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovsdbserver-sb-0" podStartSLOduration=20.844300258 podStartE2EDuration="40.713477933s" podCreationTimestamp="2025-12-05 06:09:47 +0000 UTC" firstStartedPulling="2025-12-05 06:10:04.77774795 +0000 UTC m=+1024.057759172" lastFinishedPulling="2025-12-05 06:10:24.646925615 +0000 UTC m=+1043.926936847" observedRunningTime="2025-12-05 06:10:27.704344165 +0000 UTC m=+1046.984355387" watchObservedRunningTime="2025-12-05 06:10:27.713477933 +0000 UTC m=+1046.993489145" Dec 05 06:10:27 crc kubenswrapper[4865]: I1205 06:10:27.747416 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=22.091592976 podStartE2EDuration="41.747393763s" podCreationTimestamp="2025-12-05 06:09:46 +0000 UTC" firstStartedPulling="2025-12-05 06:10:04.909350435 +0000 UTC m=+1024.189361657" lastFinishedPulling="2025-12-05 06:10:24.565151212 +0000 UTC m=+1043.845162444" observedRunningTime="2025-12-05 06:10:27.729307661 +0000 UTC m=+1047.009318893" watchObservedRunningTime="2025-12-05 06:10:27.747393763 +0000 UTC m=+1047.027404985" Dec 05 06:10:27 crc kubenswrapper[4865]: I1205 06:10:27.833006 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 05 06:10:28 crc kubenswrapper[4865]: I1205 06:10:28.293153 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 05 06:10:28 crc kubenswrapper[4865]: I1205 06:10:28.329736 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 05 06:10:28 crc kubenswrapper[4865]: I1205 06:10:28.635770 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 05 06:10:28 crc kubenswrapper[4865]: I1205 06:10:28.713197 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 05 06:10:28 crc kubenswrapper[4865]: I1205 06:10:28.999021 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pw9js"] Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.031209 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-b4c2s"] Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.032464 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.034664 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.036372 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8cc7fc4dc-cj64h"] Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.037714 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.040056 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.075594 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8cc7fc4dc-cj64h"] Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.085965 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-b4c2s"] Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.120791 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034bb156-f8de-4fb1-bb44-b952c3f6a019-config\") pod \"ovn-controller-metrics-b4c2s\" (UID: \"034bb156-f8de-4fb1-bb44-b952c3f6a019\") " pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.120890 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99c0930b-ba5f-45ed-a20a-16346e193307-config\") pod \"dnsmasq-dns-8cc7fc4dc-cj64h\" (UID: \"99c0930b-ba5f-45ed-a20a-16346e193307\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.120934 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/034bb156-f8de-4fb1-bb44-b952c3f6a019-combined-ca-bundle\") pod \"ovn-controller-metrics-b4c2s\" (UID: \"034bb156-f8de-4fb1-bb44-b952c3f6a019\") " pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.121015 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99c0930b-ba5f-45ed-a20a-16346e193307-dns-svc\") pod \"dnsmasq-dns-8cc7fc4dc-cj64h\" (UID: \"99c0930b-ba5f-45ed-a20a-16346e193307\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.121059 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qshnb\" (UniqueName: \"kubernetes.io/projected/99c0930b-ba5f-45ed-a20a-16346e193307-kube-api-access-qshnb\") pod \"dnsmasq-dns-8cc7fc4dc-cj64h\" (UID: \"99c0930b-ba5f-45ed-a20a-16346e193307\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.121104 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/034bb156-f8de-4fb1-bb44-b952c3f6a019-ovs-rundir\") pod \"ovn-controller-metrics-b4c2s\" (UID: \"034bb156-f8de-4fb1-bb44-b952c3f6a019\") " pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.121150 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/034bb156-f8de-4fb1-bb44-b952c3f6a019-ovn-rundir\") pod \"ovn-controller-metrics-b4c2s\" (UID: \"034bb156-f8de-4fb1-bb44-b952c3f6a019\") " pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.121183 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/99c0930b-ba5f-45ed-a20a-16346e193307-ovsdbserver-sb\") pod \"dnsmasq-dns-8cc7fc4dc-cj64h\" (UID: \"99c0930b-ba5f-45ed-a20a-16346e193307\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.121213 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvvsw\" (UniqueName: \"kubernetes.io/projected/034bb156-f8de-4fb1-bb44-b952c3f6a019-kube-api-access-xvvsw\") pod \"ovn-controller-metrics-b4c2s\" (UID: \"034bb156-f8de-4fb1-bb44-b952c3f6a019\") " pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.121247 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/034bb156-f8de-4fb1-bb44-b952c3f6a019-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-b4c2s\" (UID: \"034bb156-f8de-4fb1-bb44-b952c3f6a019\") " pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.223101 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99c0930b-ba5f-45ed-a20a-16346e193307-config\") pod \"dnsmasq-dns-8cc7fc4dc-cj64h\" (UID: \"99c0930b-ba5f-45ed-a20a-16346e193307\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.223165 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/034bb156-f8de-4fb1-bb44-b952c3f6a019-combined-ca-bundle\") pod \"ovn-controller-metrics-b4c2s\" (UID: \"034bb156-f8de-4fb1-bb44-b952c3f6a019\") " pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.223202 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99c0930b-ba5f-45ed-a20a-16346e193307-dns-svc\") pod \"dnsmasq-dns-8cc7fc4dc-cj64h\" (UID: \"99c0930b-ba5f-45ed-a20a-16346e193307\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.223232 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qshnb\" (UniqueName: \"kubernetes.io/projected/99c0930b-ba5f-45ed-a20a-16346e193307-kube-api-access-qshnb\") pod \"dnsmasq-dns-8cc7fc4dc-cj64h\" (UID: \"99c0930b-ba5f-45ed-a20a-16346e193307\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.223264 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/034bb156-f8de-4fb1-bb44-b952c3f6a019-ovs-rundir\") pod \"ovn-controller-metrics-b4c2s\" (UID: \"034bb156-f8de-4fb1-bb44-b952c3f6a019\") " pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.223285 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/034bb156-f8de-4fb1-bb44-b952c3f6a019-ovn-rundir\") pod \"ovn-controller-metrics-b4c2s\" (UID: \"034bb156-f8de-4fb1-bb44-b952c3f6a019\") " pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.223309 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/99c0930b-ba5f-45ed-a20a-16346e193307-ovsdbserver-sb\") pod \"dnsmasq-dns-8cc7fc4dc-cj64h\" (UID: \"99c0930b-ba5f-45ed-a20a-16346e193307\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.223330 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvvsw\" (UniqueName: \"kubernetes.io/projected/034bb156-f8de-4fb1-bb44-b952c3f6a019-kube-api-access-xvvsw\") pod \"ovn-controller-metrics-b4c2s\" (UID: \"034bb156-f8de-4fb1-bb44-b952c3f6a019\") " pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.223351 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/034bb156-f8de-4fb1-bb44-b952c3f6a019-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-b4c2s\" (UID: \"034bb156-f8de-4fb1-bb44-b952c3f6a019\") " pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.223389 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034bb156-f8de-4fb1-bb44-b952c3f6a019-config\") pod \"ovn-controller-metrics-b4c2s\" (UID: \"034bb156-f8de-4fb1-bb44-b952c3f6a019\") " pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.223643 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/034bb156-f8de-4fb1-bb44-b952c3f6a019-ovs-rundir\") pod \"ovn-controller-metrics-b4c2s\" (UID: \"034bb156-f8de-4fb1-bb44-b952c3f6a019\") " pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.223933 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/034bb156-f8de-4fb1-bb44-b952c3f6a019-ovn-rundir\") pod \"ovn-controller-metrics-b4c2s\" (UID: \"034bb156-f8de-4fb1-bb44-b952c3f6a019\") " pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.224060 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99c0930b-ba5f-45ed-a20a-16346e193307-config\") pod \"dnsmasq-dns-8cc7fc4dc-cj64h\" (UID: \"99c0930b-ba5f-45ed-a20a-16346e193307\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.224180 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034bb156-f8de-4fb1-bb44-b952c3f6a019-config\") pod \"ovn-controller-metrics-b4c2s\" (UID: \"034bb156-f8de-4fb1-bb44-b952c3f6a019\") " pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.224177 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99c0930b-ba5f-45ed-a20a-16346e193307-ovsdbserver-sb\") pod \"dnsmasq-dns-8cc7fc4dc-cj64h\" (UID: \"99c0930b-ba5f-45ed-a20a-16346e193307\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.224879 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99c0930b-ba5f-45ed-a20a-16346e193307-dns-svc\") pod \"dnsmasq-dns-8cc7fc4dc-cj64h\" (UID: 
\"99c0930b-ba5f-45ed-a20a-16346e193307\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.227865 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/034bb156-f8de-4fb1-bb44-b952c3f6a019-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-b4c2s\" (UID: \"034bb156-f8de-4fb1-bb44-b952c3f6a019\") " pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.228414 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/034bb156-f8de-4fb1-bb44-b952c3f6a019-combined-ca-bundle\") pod \"ovn-controller-metrics-b4c2s\" (UID: \"034bb156-f8de-4fb1-bb44-b952c3f6a019\") " pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.255040 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvvsw\" (UniqueName: \"kubernetes.io/projected/034bb156-f8de-4fb1-bb44-b952c3f6a019-kube-api-access-xvvsw\") pod \"ovn-controller-metrics-b4c2s\" (UID: \"034bb156-f8de-4fb1-bb44-b952c3f6a019\") " pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.261088 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qshnb\" (UniqueName: \"kubernetes.io/projected/99c0930b-ba5f-45ed-a20a-16346e193307-kube-api-access-qshnb\") pod \"dnsmasq-dns-8cc7fc4dc-cj64h\" (UID: \"99c0930b-ba5f-45ed-a20a-16346e193307\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.360336 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-b4c2s" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.382209 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.619307 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-x8c88"] Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.686805 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pw9js" event={"ID":"273190c6-096d-4098-b426-7fa1854b09f8","Type":"ContainerStarted","Data":"c3a253d3a755477c9cb48b5ecc790cf8bee099ba677b79c89398c0b4da68bb29"} Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.687151 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-pw9js" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.703146 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" event={"ID":"3c3df550-e8c3-4437-8299-794ff6295eed","Type":"ContainerStarted","Data":"82cc873c89fc708f08841c277ab3892b4f5670875faa32768636870f72dd4ea9"} Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.703810 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.719702 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gx5lg" event={"ID":"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea","Type":"ContainerStarted","Data":"14f3bc4ef563ead7fe047b98ed74265ffceb7b4171894452d833dd4aa680877d"} Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.719769 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-hrq8n"] Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.731800 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.738075 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.776676 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-hrq8n"] Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.797417 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-pw9js" podStartSLOduration=6.34199747 podStartE2EDuration="54.79737805s" podCreationTimestamp="2025-12-05 06:09:35 +0000 UTC" firstStartedPulling="2025-12-05 06:09:36.299712626 +0000 UTC m=+995.579723848" lastFinishedPulling="2025-12-05 06:10:24.755093206 +0000 UTC m=+1044.035104428" observedRunningTime="2025-12-05 06:10:29.752175711 +0000 UTC m=+1049.032186933" watchObservedRunningTime="2025-12-05 06:10:29.79737805 +0000 UTC m=+1049.077389272" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.834446 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.839401 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-hrq8n\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.839495 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-hrq8n\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.839532 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-hrq8n\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.839633 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccmzm\" (UniqueName: \"kubernetes.io/projected/2c8cda95-9251-4c67-90f3-0cc090868b6f-kube-api-access-ccmzm\") pod \"dnsmasq-dns-b8fbc5445-hrq8n\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.840552 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-config\") pod \"dnsmasq-dns-b8fbc5445-hrq8n\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.856846 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" podStartSLOduration=7.26912463 podStartE2EDuration="17.856802201s" podCreationTimestamp="2025-12-05 06:10:12 +0000 UTC" firstStartedPulling="2025-12-05 06:10:14.067680613 
+0000 UTC m=+1033.347691835" lastFinishedPulling="2025-12-05 06:10:24.655358184 +0000 UTC m=+1043.935369406" observedRunningTime="2025-12-05 06:10:29.834169971 +0000 UTC m=+1049.114181193" watchObservedRunningTime="2025-12-05 06:10:29.856802201 +0000 UTC m=+1049.136813423" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.886141 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.915665 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-gx5lg" podStartSLOduration=3.62314004 podStartE2EDuration="15.915641676s" podCreationTimestamp="2025-12-05 06:10:14 +0000 UTC" firstStartedPulling="2025-12-05 06:10:16.151412323 +0000 UTC m=+1035.431423545" lastFinishedPulling="2025-12-05 06:10:28.443913959 +0000 UTC m=+1047.723925181" observedRunningTime="2025-12-05 06:10:29.853527269 +0000 UTC m=+1049.133538501" watchObservedRunningTime="2025-12-05 06:10:29.915641676 +0000 UTC m=+1049.195652898" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.941870 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-config\") pod \"dnsmasq-dns-b8fbc5445-hrq8n\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.942212 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-hrq8n\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.942275 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-hrq8n\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.942299 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-hrq8n\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.942362 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccmzm\" (UniqueName: \"kubernetes.io/projected/2c8cda95-9251-4c67-90f3-0cc090868b6f-kube-api-access-ccmzm\") pod \"dnsmasq-dns-b8fbc5445-hrq8n\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.943989 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-config\") pod \"dnsmasq-dns-b8fbc5445-hrq8n\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.944251 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-hrq8n\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.944495 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-hrq8n\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.945107 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-hrq8n\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:10:29 crc kubenswrapper[4865]: I1205 06:10:29.969588 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccmzm\" (UniqueName: \"kubernetes.io/projected/2c8cda95-9251-4c67-90f3-0cc090868b6f-kube-api-access-ccmzm\") pod \"dnsmasq-dns-b8fbc5445-hrq8n\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:10:30 crc kubenswrapper[4865]: I1205 06:10:30.065143 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8cc7fc4dc-cj64h"] Dec 05 06:10:30 crc kubenswrapper[4865]: W1205 06:10:30.073096 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99c0930b_ba5f_45ed_a20a_16346e193307.slice/crio-81144be3f22a20e103a202245b0089aa5c7e12b60ac75b733f963307267507d6 WatchSource:0}: Error finding container 81144be3f22a20e103a202245b0089aa5c7e12b60ac75b733f963307267507d6: Status 404 returned error can't find the container with id 81144be3f22a20e103a202245b0089aa5c7e12b60ac75b733f963307267507d6 Dec 05 06:10:30 crc kubenswrapper[4865]: I1205 06:10:30.108469 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:10:30 crc kubenswrapper[4865]: I1205 06:10:30.131852 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-b4c2s"] Dec 05 06:10:30 crc kubenswrapper[4865]: I1205 06:10:30.145511 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:30 crc kubenswrapper[4865]: E1205 06:10:30.146491 4865 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 06:10:30 crc kubenswrapper[4865]: E1205 06:10:30.146513 4865 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 06:10:30 crc kubenswrapper[4865]: E1205 06:10:30.146552 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift podName:30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f nodeName:}" failed. No retries permitted until 2025-12-05 06:10:46.146536868 +0000 UTC m=+1065.426548090 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift") pod "swift-storage-0" (UID: "30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f") : configmap "swift-ring-files" not found Dec 05 06:10:30 crc kubenswrapper[4865]: I1205 06:10:30.579618 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-hrq8n"] Dec 05 06:10:30 crc kubenswrapper[4865]: W1205 06:10:30.581116 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c8cda95_9251_4c67_90f3_0cc090868b6f.slice/crio-7c5bca76f821305550cf43ce45bd89989a8e6314a4f290a694e788b8e60ad1c2 WatchSource:0}: Error finding container 7c5bca76f821305550cf43ce45bd89989a8e6314a4f290a694e788b8e60ad1c2: Status 404 returned error can't find the container with id 7c5bca76f821305550cf43ce45bd89989a8e6314a4f290a694e788b8e60ad1c2 Dec 05 06:10:30 crc kubenswrapper[4865]: I1205 06:10:30.715528 4865 generic.go:334] "Generic (PLEG): container finished" podID="8a62a048-0ebe-4e5e-988a-4dde7746af74" containerID="7e0be6fd60b0b6baf392b35658c5b42b2081be94921a74a76dc733fb0f8485b2" exitCode=0 Dec 05 06:10:30 crc kubenswrapper[4865]: I1205 06:10:30.715600 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8a62a048-0ebe-4e5e-988a-4dde7746af74","Type":"ContainerDied","Data":"7e0be6fd60b0b6baf392b35658c5b42b2081be94921a74a76dc733fb0f8485b2"} Dec 05 06:10:30 crc kubenswrapper[4865]: I1205 06:10:30.720743 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-b4c2s" event={"ID":"034bb156-f8de-4fb1-bb44-b952c3f6a019","Type":"ContainerStarted","Data":"339dfcf84879362ff6db04218fe22867b0c1745f5a4b21d03a985aadd6161001"} Dec 05 06:10:30 crc kubenswrapper[4865]: I1205 06:10:30.720781 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-b4c2s" event={"ID":"034bb156-f8de-4fb1-bb44-b952c3f6a019","Type":"ContainerStarted","Data":"c527d63a1e98acd533948d38e3ec5f155b3db7a44f2299e00228e9845a29773a"} Dec 05 06:10:30 crc kubenswrapper[4865]: I1205 06:10:30.727992 4865 generic.go:334] "Generic (PLEG): container finished" podID="99c0930b-ba5f-45ed-a20a-16346e193307" containerID="a12f75413fb86a56001913166df08b000e51cd97f02c5cf3e2621b1fa9181091" exitCode=0 Dec 05 06:10:30 crc kubenswrapper[4865]: I1205 06:10:30.728621 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" event={"ID":"99c0930b-ba5f-45ed-a20a-16346e193307","Type":"ContainerDied","Data":"a12f75413fb86a56001913166df08b000e51cd97f02c5cf3e2621b1fa9181091"} Dec 05 06:10:30 crc kubenswrapper[4865]: I1205 06:10:30.728656 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" event={"ID":"99c0930b-ba5f-45ed-a20a-16346e193307","Type":"ContainerStarted","Data":"81144be3f22a20e103a202245b0089aa5c7e12b60ac75b733f963307267507d6"} Dec 05 06:10:30 crc kubenswrapper[4865]: I1205 06:10:30.742121 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-pw9js" podUID="273190c6-096d-4098-b426-7fa1854b09f8" containerName="dnsmasq-dns" containerID="cri-o://c3a253d3a755477c9cb48b5ecc790cf8bee099ba677b79c89398c0b4da68bb29" gracePeriod=10 Dec 05 06:10:30 crc kubenswrapper[4865]: I1205 06:10:30.742268 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" 
event={"ID":"2c8cda95-9251-4c67-90f3-0cc090868b6f","Type":"ContainerStarted","Data":"7c5bca76f821305550cf43ce45bd89989a8e6314a4f290a694e788b8e60ad1c2"} Dec 05 06:10:30 crc kubenswrapper[4865]: I1205 06:10:30.743278 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" podUID="3c3df550-e8c3-4437-8299-794ff6295eed" containerName="dnsmasq-dns" containerID="cri-o://82cc873c89fc708f08841c277ab3892b4f5670875faa32768636870f72dd4ea9" gracePeriod=10 Dec 05 06:10:30 crc kubenswrapper[4865]: I1205 06:10:30.813323 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 05 06:10:30 crc kubenswrapper[4865]: I1205 06:10:30.817569 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-b4c2s" podStartSLOduration=1.817546643 podStartE2EDuration="1.817546643s" podCreationTimestamp="2025-12-05 06:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:10:30.813671463 +0000 UTC m=+1050.093682685" watchObservedRunningTime="2025-12-05 06:10:30.817546643 +0000 UTC m=+1050.097557865" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.239725 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.241403 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.251309 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.251398 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-q2x4z" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.251626 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.251965 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.267267 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.302243 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pw9js" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.385268 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7ebd484-c0dc-45cf-a057-46cb8f76f212-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.385325 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7ebd484-c0dc-45cf-a057-46cb8f76f212-config\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.385348 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7ebd484-c0dc-45cf-a057-46cb8f76f212-scripts\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.385535 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7ebd484-c0dc-45cf-a057-46cb8f76f212-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.385634 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7ebd484-c0dc-45cf-a057-46cb8f76f212-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.385660 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwm7k\" (UniqueName: \"kubernetes.io/projected/c7ebd484-c0dc-45cf-a057-46cb8f76f212-kube-api-access-hwm7k\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.385695 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c7ebd484-c0dc-45cf-a057-46cb8f76f212-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.488546 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/273190c6-096d-4098-b426-7fa1854b09f8-dns-svc\") pod \"273190c6-096d-4098-b426-7fa1854b09f8\" (UID: \"273190c6-096d-4098-b426-7fa1854b09f8\") " Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.488655 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwcd4\" (UniqueName: \"kubernetes.io/projected/273190c6-096d-4098-b426-7fa1854b09f8-kube-api-access-jwcd4\") pod \"273190c6-096d-4098-b426-7fa1854b09f8\" (UID: \"273190c6-096d-4098-b426-7fa1854b09f8\") " Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.488697 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/273190c6-096d-4098-b426-7fa1854b09f8-config\") pod \"273190c6-096d-4098-b426-7fa1854b09f8\" (UID: \"273190c6-096d-4098-b426-7fa1854b09f8\") " Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.489074 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7ebd484-c0dc-45cf-a057-46cb8f76f212-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.489095 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwm7k\" (UniqueName: \"kubernetes.io/projected/c7ebd484-c0dc-45cf-a057-46cb8f76f212-kube-api-access-hwm7k\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.489123 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c7ebd484-c0dc-45cf-a057-46cb8f76f212-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.489172 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7ebd484-c0dc-45cf-a057-46cb8f76f212-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.489191 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7ebd484-c0dc-45cf-a057-46cb8f76f212-config\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.489205 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7ebd484-c0dc-45cf-a057-46cb8f76f212-scripts\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.489267 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7ebd484-c0dc-45cf-a057-46cb8f76f212-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.495221 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7ebd484-c0dc-45cf-a057-46cb8f76f212-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.496051 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7ebd484-c0dc-45cf-a057-46cb8f76f212-config\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.496801 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c7ebd484-c0dc-45cf-a057-46cb8f76f212-scripts\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.497180 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c7ebd484-c0dc-45cf-a057-46cb8f76f212-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.508964 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7ebd484-c0dc-45cf-a057-46cb8f76f212-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.513166 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/273190c6-096d-4098-b426-7fa1854b09f8-kube-api-access-jwcd4" (OuterVolumeSpecName: "kube-api-access-jwcd4") pod "273190c6-096d-4098-b426-7fa1854b09f8" (UID: "273190c6-096d-4098-b426-7fa1854b09f8"). InnerVolumeSpecName "kube-api-access-jwcd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.514491 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwm7k\" (UniqueName: \"kubernetes.io/projected/c7ebd484-c0dc-45cf-a057-46cb8f76f212-kube-api-access-hwm7k\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.524606 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7ebd484-c0dc-45cf-a057-46cb8f76f212-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c7ebd484-c0dc-45cf-a057-46cb8f76f212\") " pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.556285 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/273190c6-096d-4098-b426-7fa1854b09f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "273190c6-096d-4098-b426-7fa1854b09f8" (UID: "273190c6-096d-4098-b426-7fa1854b09f8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.587612 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/273190c6-096d-4098-b426-7fa1854b09f8-config" (OuterVolumeSpecName: "config") pod "273190c6-096d-4098-b426-7fa1854b09f8" (UID: "273190c6-096d-4098-b426-7fa1854b09f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.587653 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.590752 4865 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/273190c6-096d-4098-b426-7fa1854b09f8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.590779 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwcd4\" (UniqueName: \"kubernetes.io/projected/273190c6-096d-4098-b426-7fa1854b09f8-kube-api-access-jwcd4\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.590792 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/273190c6-096d-4098-b426-7fa1854b09f8-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.620042 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.758046 4865 generic.go:334] "Generic (PLEG): container finished" podID="3c3df550-e8c3-4437-8299-794ff6295eed" containerID="82cc873c89fc708f08841c277ab3892b4f5670875faa32768636870f72dd4ea9" exitCode=0 Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.758101 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.758147 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" event={"ID":"3c3df550-e8c3-4437-8299-794ff6295eed","Type":"ContainerDied","Data":"82cc873c89fc708f08841c277ab3892b4f5670875faa32768636870f72dd4ea9"} Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.758173 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-x8c88" event={"ID":"3c3df550-e8c3-4437-8299-794ff6295eed","Type":"ContainerDied","Data":"a1cc62fbbc07037ce370be68fff10c21cb85b56d1c6e66ff9b958b4794184c25"} Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.758190 4865 scope.go:117] "RemoveContainer" containerID="82cc873c89fc708f08841c277ab3892b4f5670875faa32768636870f72dd4ea9" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.762085 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8a62a048-0ebe-4e5e-988a-4dde7746af74","Type":"ContainerStarted","Data":"352c685875d1e23eb3b2de72b49e99ab14eefeb3c6f6657f35834261f65ef664"} Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.769442 4865 generic.go:334] "Generic (PLEG): container finished" podID="273190c6-096d-4098-b426-7fa1854b09f8" containerID="c3a253d3a755477c9cb48b5ecc790cf8bee099ba677b79c89398c0b4da68bb29" exitCode=0 Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.769510 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pw9js" event={"ID":"273190c6-096d-4098-b426-7fa1854b09f8","Type":"ContainerDied","Data":"c3a253d3a755477c9cb48b5ecc790cf8bee099ba677b79c89398c0b4da68bb29"} Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.769535 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pw9js" event={"ID":"273190c6-096d-4098-b426-7fa1854b09f8","Type":"ContainerDied","Data":"a9431d9e41571c77961e5038e99f6820c8b44de0c2a70c406f095d7380deb848"} Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.769610 4865 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pw9js" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.775790 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" event={"ID":"99c0930b-ba5f-45ed-a20a-16346e193307","Type":"ContainerStarted","Data":"461eafee70f13ed02576ce30b870d8f6055f13b32a33d66325a1c7757824aed6"} Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.776007 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.781935 4865 generic.go:334] "Generic (PLEG): container finished" podID="2c8cda95-9251-4c67-90f3-0cc090868b6f" containerID="153fd74b9be2b5b1e3788e708193984c3182264607700e1f502862150b4126fa" exitCode=0 Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.783218 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" event={"ID":"2c8cda95-9251-4c67-90f3-0cc090868b6f","Type":"ContainerDied","Data":"153fd74b9be2b5b1e3788e708193984c3182264607700e1f502862150b4126fa"} Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.796708 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3df550-e8c3-4437-8299-794ff6295eed-config\") pod \"3c3df550-e8c3-4437-8299-794ff6295eed\" (UID: \"3c3df550-e8c3-4437-8299-794ff6295eed\") " Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.796997 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c3df550-e8c3-4437-8299-794ff6295eed-dns-svc\") pod \"3c3df550-e8c3-4437-8299-794ff6295eed\" (UID: \"3c3df550-e8c3-4437-8299-794ff6295eed\") " Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.797127 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vldh\" (UniqueName: \"kubernetes.io/projected/3c3df550-e8c3-4437-8299-794ff6295eed-kube-api-access-6vldh\") pod \"3c3df550-e8c3-4437-8299-794ff6295eed\" (UID: \"3c3df550-e8c3-4437-8299-794ff6295eed\") " Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.801017 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c3df550-e8c3-4437-8299-794ff6295eed-kube-api-access-6vldh" (OuterVolumeSpecName: "kube-api-access-6vldh") pod "3c3df550-e8c3-4437-8299-794ff6295eed" (UID: "3c3df550-e8c3-4437-8299-794ff6295eed"). InnerVolumeSpecName "kube-api-access-6vldh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.822117 4865 scope.go:117] "RemoveContainer" containerID="f6312a57db2b894b81b729f5afa5acae2c2b0adea00b6b315021ed13cf2e86f6" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.874808 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371981.97999 podStartE2EDuration="54.874785074s" podCreationTimestamp="2025-12-05 06:09:37 +0000 UTC" firstStartedPulling="2025-12-05 06:09:39.4447174 +0000 UTC m=+998.724728632" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:10:31.798952139 +0000 UTC m=+1051.078963371" watchObservedRunningTime="2025-12-05 06:10:31.874785074 +0000 UTC m=+1051.154796296" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.889601 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" podStartSLOduration=2.889580513 podStartE2EDuration="2.889580513s" podCreationTimestamp="2025-12-05 06:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:10:31.863890486 +0000 UTC m=+1051.143901708" watchObservedRunningTime="2025-12-05 06:10:31.889580513 +0000 UTC m=+1051.169591745" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.891566 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c3df550-e8c3-4437-8299-794ff6295eed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3c3df550-e8c3-4437-8299-794ff6295eed" (UID: "3c3df550-e8c3-4437-8299-794ff6295eed"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.900652 4865 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c3df550-e8c3-4437-8299-794ff6295eed-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.900687 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vldh\" (UniqueName: \"kubernetes.io/projected/3c3df550-e8c3-4437-8299-794ff6295eed-kube-api-access-6vldh\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.901368 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pw9js"] Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.906156 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c3df550-e8c3-4437-8299-794ff6295eed-config" (OuterVolumeSpecName: "config") pod "3c3df550-e8c3-4437-8299-794ff6295eed" (UID: "3c3df550-e8c3-4437-8299-794ff6295eed"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.906493 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pw9js"] Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.906761 4865 scope.go:117] "RemoveContainer" containerID="82cc873c89fc708f08841c277ab3892b4f5670875faa32768636870f72dd4ea9" Dec 05 06:10:31 crc kubenswrapper[4865]: E1205 06:10:31.907399 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82cc873c89fc708f08841c277ab3892b4f5670875faa32768636870f72dd4ea9\": container with ID starting with 82cc873c89fc708f08841c277ab3892b4f5670875faa32768636870f72dd4ea9 not found: ID does not exist" containerID="82cc873c89fc708f08841c277ab3892b4f5670875faa32768636870f72dd4ea9" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.907499 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82cc873c89fc708f08841c277ab3892b4f5670875faa32768636870f72dd4ea9"} err="failed to get container status \"82cc873c89fc708f08841c277ab3892b4f5670875faa32768636870f72dd4ea9\": rpc error: code = NotFound desc = could not find container \"82cc873c89fc708f08841c277ab3892b4f5670875faa32768636870f72dd4ea9\": container with ID starting with 82cc873c89fc708f08841c277ab3892b4f5670875faa32768636870f72dd4ea9 not found: ID does not exist" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.907574 4865 scope.go:117] "RemoveContainer" containerID="f6312a57db2b894b81b729f5afa5acae2c2b0adea00b6b315021ed13cf2e86f6" Dec 05 06:10:31 crc kubenswrapper[4865]: E1205 06:10:31.907780 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6312a57db2b894b81b729f5afa5acae2c2b0adea00b6b315021ed13cf2e86f6\": container with ID starting with f6312a57db2b894b81b729f5afa5acae2c2b0adea00b6b315021ed13cf2e86f6 not found: ID does not exist" containerID="f6312a57db2b894b81b729f5afa5acae2c2b0adea00b6b315021ed13cf2e86f6" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.907959 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6312a57db2b894b81b729f5afa5acae2c2b0adea00b6b315021ed13cf2e86f6"} err="failed to get container status \"f6312a57db2b894b81b729f5afa5acae2c2b0adea00b6b315021ed13cf2e86f6\": rpc error: code = NotFound desc = could not find container \"f6312a57db2b894b81b729f5afa5acae2c2b0adea00b6b315021ed13cf2e86f6\": container with ID starting with f6312a57db2b894b81b729f5afa5acae2c2b0adea00b6b315021ed13cf2e86f6 not found: ID does not exist" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.908028 4865 scope.go:117] "RemoveContainer" containerID="c3a253d3a755477c9cb48b5ecc790cf8bee099ba677b79c89398c0b4da68bb29" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.925154 4865 scope.go:117] "RemoveContainer" containerID="9641c986667bddf044053c2415397ccd95879e5899fb4ca109cf1c42a6f072d8" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.961194 4865 scope.go:117] "RemoveContainer" containerID="c3a253d3a755477c9cb48b5ecc790cf8bee099ba677b79c89398c0b4da68bb29" Dec 05 06:10:31 crc kubenswrapper[4865]: E1205 06:10:31.961728 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3a253d3a755477c9cb48b5ecc790cf8bee099ba677b79c89398c0b4da68bb29\": container with ID starting with 
c3a253d3a755477c9cb48b5ecc790cf8bee099ba677b79c89398c0b4da68bb29 not found: ID does not exist" containerID="c3a253d3a755477c9cb48b5ecc790cf8bee099ba677b79c89398c0b4da68bb29" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.961769 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3a253d3a755477c9cb48b5ecc790cf8bee099ba677b79c89398c0b4da68bb29"} err="failed to get container status \"c3a253d3a755477c9cb48b5ecc790cf8bee099ba677b79c89398c0b4da68bb29\": rpc error: code = NotFound desc = could not find container \"c3a253d3a755477c9cb48b5ecc790cf8bee099ba677b79c89398c0b4da68bb29\": container with ID starting with c3a253d3a755477c9cb48b5ecc790cf8bee099ba677b79c89398c0b4da68bb29 not found: ID does not exist" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.961799 4865 scope.go:117] "RemoveContainer" containerID="9641c986667bddf044053c2415397ccd95879e5899fb4ca109cf1c42a6f072d8" Dec 05 06:10:31 crc kubenswrapper[4865]: E1205 06:10:31.962086 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9641c986667bddf044053c2415397ccd95879e5899fb4ca109cf1c42a6f072d8\": container with ID starting with 9641c986667bddf044053c2415397ccd95879e5899fb4ca109cf1c42a6f072d8 not found: ID does not exist" containerID="9641c986667bddf044053c2415397ccd95879e5899fb4ca109cf1c42a6f072d8" Dec 05 06:10:31 crc kubenswrapper[4865]: I1205 06:10:31.962110 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9641c986667bddf044053c2415397ccd95879e5899fb4ca109cf1c42a6f072d8"} err="failed to get container status \"9641c986667bddf044053c2415397ccd95879e5899fb4ca109cf1c42a6f072d8\": rpc error: code = NotFound desc = could not find container \"9641c986667bddf044053c2415397ccd95879e5899fb4ca109cf1c42a6f072d8\": container with ID starting with 9641c986667bddf044053c2415397ccd95879e5899fb4ca109cf1c42a6f072d8 not found: ID does not exist" Dec 05 06:10:32 crc kubenswrapper[4865]: I1205 06:10:32.002266 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3df550-e8c3-4437-8299-794ff6295eed-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:32 crc kubenswrapper[4865]: I1205 06:10:32.098881 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-x8c88"] Dec 05 06:10:32 crc kubenswrapper[4865]: I1205 06:10:32.105461 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-x8c88"] Dec 05 06:10:32 crc kubenswrapper[4865]: W1205 06:10:32.109132 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7ebd484_c0dc_45cf_a057_46cb8f76f212.slice/crio-41c505eeab8d07c8f19e323cc220f5df17be1305e4172e060d82d4948f0d49f9 WatchSource:0}: Error finding container 41c505eeab8d07c8f19e323cc220f5df17be1305e4172e060d82d4948f0d49f9: Status 404 returned error can't find the container with id 41c505eeab8d07c8f19e323cc220f5df17be1305e4172e060d82d4948f0d49f9 Dec 05 06:10:32 crc kubenswrapper[4865]: I1205 06:10:32.111344 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 06:10:32 crc kubenswrapper[4865]: I1205 06:10:32.791023 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"c7ebd484-c0dc-45cf-a057-46cb8f76f212","Type":"ContainerStarted","Data":"41c505eeab8d07c8f19e323cc220f5df17be1305e4172e060d82d4948f0d49f9"} Dec 05 06:10:32 crc kubenswrapper[4865]: I1205 06:10:32.793773 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" event={"ID":"2c8cda95-9251-4c67-90f3-0cc090868b6f","Type":"ContainerStarted","Data":"c7ce465cce771b87eafdce55819bd0e77d29d2739c5278e527c3bc6ffb43fd2c"} Dec 05 06:10:32 crc kubenswrapper[4865]: I1205 06:10:32.793871 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:10:32 crc kubenswrapper[4865]: I1205 06:10:32.826640 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" podStartSLOduration=3.826619832 podStartE2EDuration="3.826619832s" podCreationTimestamp="2025-12-05 06:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:10:32.818082891 +0000 UTC m=+1052.098094123" watchObservedRunningTime="2025-12-05 06:10:32.826619832 +0000 UTC m=+1052.106631054" Dec 05 06:10:33 crc kubenswrapper[4865]: I1205 06:10:33.021196 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="273190c6-096d-4098-b426-7fa1854b09f8" path="/var/lib/kubelet/pods/273190c6-096d-4098-b426-7fa1854b09f8/volumes" Dec 05 06:10:33 crc kubenswrapper[4865]: I1205 06:10:33.022160 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c3df550-e8c3-4437-8299-794ff6295eed" path="/var/lib/kubelet/pods/3c3df550-e8c3-4437-8299-794ff6295eed/volumes" Dec 05 06:10:33 crc kubenswrapper[4865]: I1205 06:10:33.810258 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c7ebd484-c0dc-45cf-a057-46cb8f76f212","Type":"ContainerStarted","Data":"68c229f17b5ac6f99f9df34004bf9a428768732b121e8cb950d4961e42529954"} Dec 05 06:10:33 crc kubenswrapper[4865]: I1205 06:10:33.810598 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c7ebd484-c0dc-45cf-a057-46cb8f76f212","Type":"ContainerStarted","Data":"e7fc93b83c6ce33ce0d9e17016a394a32434c72f696bc978d23338137eac22e3"} Dec 05 06:10:33 crc kubenswrapper[4865]: I1205 06:10:33.827461 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 05 06:10:33 crc kubenswrapper[4865]: I1205 06:10:33.863288 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.739008544 podStartE2EDuration="2.863265741s" podCreationTimestamp="2025-12-05 06:10:31 +0000 UTC" firstStartedPulling="2025-12-05 06:10:32.112009475 +0000 UTC m=+1051.392020697" lastFinishedPulling="2025-12-05 06:10:33.236266672 +0000 UTC m=+1052.516277894" observedRunningTime="2025-12-05 06:10:33.848460662 +0000 UTC m=+1053.128471894" watchObservedRunningTime="2025-12-05 06:10:33.863265741 +0000 UTC m=+1053.143276963" Dec 05 06:10:38 crc kubenswrapper[4865]: I1205 06:10:38.517247 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 05 06:10:38 crc kubenswrapper[4865]: I1205 06:10:38.519013 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 05 06:10:38 crc kubenswrapper[4865]: I1205 06:10:38.605959 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/openstack-galera-0" Dec 05 06:10:38 crc kubenswrapper[4865]: I1205 06:10:38.863250 4865 generic.go:334] "Generic (PLEG): container finished" podID="1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea" containerID="14f3bc4ef563ead7fe047b98ed74265ffceb7b4171894452d833dd4aa680877d" exitCode=0 Dec 05 06:10:38 crc kubenswrapper[4865]: I1205 06:10:38.863292 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gx5lg" event={"ID":"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea","Type":"ContainerDied","Data":"14f3bc4ef563ead7fe047b98ed74265ffceb7b4171894452d833dd4aa680877d"} Dec 05 06:10:38 crc kubenswrapper[4865]: I1205 06:10:38.945581 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 05 06:10:39 crc kubenswrapper[4865]: I1205 06:10:39.384756 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" Dec 05 06:10:39 crc kubenswrapper[4865]: I1205 06:10:39.791694 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-kktpw"] Dec 05 06:10:39 crc kubenswrapper[4865]: E1205 06:10:39.792133 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="273190c6-096d-4098-b426-7fa1854b09f8" containerName="init" Dec 05 06:10:39 crc kubenswrapper[4865]: I1205 06:10:39.792158 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="273190c6-096d-4098-b426-7fa1854b09f8" containerName="init" Dec 05 06:10:39 crc kubenswrapper[4865]: E1205 06:10:39.792176 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3df550-e8c3-4437-8299-794ff6295eed" containerName="init" Dec 05 06:10:39 crc kubenswrapper[4865]: I1205 06:10:39.792184 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3df550-e8c3-4437-8299-794ff6295eed" containerName="init" Dec 05 06:10:39 crc kubenswrapper[4865]: E1205 06:10:39.792199 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3df550-e8c3-4437-8299-794ff6295eed" containerName="dnsmasq-dns" Dec 05 06:10:39 crc kubenswrapper[4865]: I1205 06:10:39.792209 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3df550-e8c3-4437-8299-794ff6295eed" containerName="dnsmasq-dns" Dec 05 06:10:39 crc kubenswrapper[4865]: E1205 06:10:39.792242 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="273190c6-096d-4098-b426-7fa1854b09f8" containerName="dnsmasq-dns" Dec 05 06:10:39 crc kubenswrapper[4865]: I1205 06:10:39.792250 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="273190c6-096d-4098-b426-7fa1854b09f8" containerName="dnsmasq-dns" Dec 05 06:10:39 crc kubenswrapper[4865]: I1205 06:10:39.792469 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="273190c6-096d-4098-b426-7fa1854b09f8" containerName="dnsmasq-dns" Dec 05 06:10:39 crc kubenswrapper[4865]: I1205 06:10:39.792490 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c3df550-e8c3-4437-8299-794ff6295eed" containerName="dnsmasq-dns" Dec 05 06:10:39 crc kubenswrapper[4865]: I1205 06:10:39.793183 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-kktpw" Dec 05 06:10:39 crc kubenswrapper[4865]: I1205 06:10:39.809714 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-kktpw"] Dec 05 06:10:39 crc kubenswrapper[4865]: I1205 06:10:39.856018 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-60a0-account-create-update-24m4m"] Dec 05 06:10:39 crc kubenswrapper[4865]: I1205 06:10:39.857321 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-60a0-account-create-update-24m4m" Dec 05 06:10:39 crc kubenswrapper[4865]: I1205 06:10:39.863402 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 05 06:10:39 crc kubenswrapper[4865]: I1205 06:10:39.873927 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-60a0-account-create-update-24m4m"] Dec 05 06:10:39 crc kubenswrapper[4865]: I1205 06:10:39.932865 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h96nj\" (UniqueName: \"kubernetes.io/projected/6d4fad1c-f15e-4d44-a71c-24d196f3c8fe-kube-api-access-h96nj\") pod \"keystone-60a0-account-create-update-24m4m\" (UID: \"6d4fad1c-f15e-4d44-a71c-24d196f3c8fe\") " pod="openstack/keystone-60a0-account-create-update-24m4m" Dec 05 06:10:39 crc kubenswrapper[4865]: I1205 06:10:39.933244 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18914495-0dfa-4528-ac93-942ccad6f5a3-operator-scripts\") pod \"keystone-db-create-kktpw\" (UID: \"18914495-0dfa-4528-ac93-942ccad6f5a3\") " pod="openstack/keystone-db-create-kktpw" Dec 05 06:10:39 crc kubenswrapper[4865]: I1205 06:10:39.933387 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g94bd\" (UniqueName: \"kubernetes.io/projected/18914495-0dfa-4528-ac93-942ccad6f5a3-kube-api-access-g94bd\") pod \"keystone-db-create-kktpw\" (UID: \"18914495-0dfa-4528-ac93-942ccad6f5a3\") " pod="openstack/keystone-db-create-kktpw" Dec 05 06:10:39 crc kubenswrapper[4865]: I1205 06:10:39.933416 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d4fad1c-f15e-4d44-a71c-24d196f3c8fe-operator-scripts\") pod \"keystone-60a0-account-create-update-24m4m\" (UID: \"6d4fad1c-f15e-4d44-a71c-24d196f3c8fe\") " pod="openstack/keystone-60a0-account-create-update-24m4m" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.008896 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-w8nx8"] Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.010326 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-w8nx8" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.024301 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-w8nx8"] Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.034513 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18914495-0dfa-4528-ac93-942ccad6f5a3-operator-scripts\") pod \"keystone-db-create-kktpw\" (UID: \"18914495-0dfa-4528-ac93-942ccad6f5a3\") " pod="openstack/keystone-db-create-kktpw" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.036045 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g94bd\" (UniqueName: \"kubernetes.io/projected/18914495-0dfa-4528-ac93-942ccad6f5a3-kube-api-access-g94bd\") pod \"keystone-db-create-kktpw\" (UID: \"18914495-0dfa-4528-ac93-942ccad6f5a3\") " pod="openstack/keystone-db-create-kktpw" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.036076 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d4fad1c-f15e-4d44-a71c-24d196f3c8fe-operator-scripts\") pod \"keystone-60a0-account-create-update-24m4m\" (UID: \"6d4fad1c-f15e-4d44-a71c-24d196f3c8fe\") " pod="openstack/keystone-60a0-account-create-update-24m4m" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.036180 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h96nj\" (UniqueName: \"kubernetes.io/projected/6d4fad1c-f15e-4d44-a71c-24d196f3c8fe-kube-api-access-h96nj\") pod \"keystone-60a0-account-create-update-24m4m\" (UID: \"6d4fad1c-f15e-4d44-a71c-24d196f3c8fe\") " pod="openstack/keystone-60a0-account-create-update-24m4m" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.035727 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18914495-0dfa-4528-ac93-942ccad6f5a3-operator-scripts\") pod \"keystone-db-create-kktpw\" (UID: \"18914495-0dfa-4528-ac93-942ccad6f5a3\") " pod="openstack/keystone-db-create-kktpw" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.037598 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d4fad1c-f15e-4d44-a71c-24d196f3c8fe-operator-scripts\") pod \"keystone-60a0-account-create-update-24m4m\" (UID: \"6d4fad1c-f15e-4d44-a71c-24d196f3c8fe\") " pod="openstack/keystone-60a0-account-create-update-24m4m" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.061431 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g94bd\" (UniqueName: \"kubernetes.io/projected/18914495-0dfa-4528-ac93-942ccad6f5a3-kube-api-access-g94bd\") pod \"keystone-db-create-kktpw\" (UID: \"18914495-0dfa-4528-ac93-942ccad6f5a3\") " pod="openstack/keystone-db-create-kktpw" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.076852 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h96nj\" (UniqueName: \"kubernetes.io/projected/6d4fad1c-f15e-4d44-a71c-24d196f3c8fe-kube-api-access-h96nj\") pod \"keystone-60a0-account-create-update-24m4m\" (UID: \"6d4fad1c-f15e-4d44-a71c-24d196f3c8fe\") " pod="openstack/keystone-60a0-account-create-update-24m4m" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.111339 4865 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.123562 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-kktpw" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.138384 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/295a1eb6-1b02-45f2-81ad-f5fae06d4146-operator-scripts\") pod \"placement-db-create-w8nx8\" (UID: \"295a1eb6-1b02-45f2-81ad-f5fae06d4146\") " pod="openstack/placement-db-create-w8nx8" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.138471 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtmth\" (UniqueName: \"kubernetes.io/projected/295a1eb6-1b02-45f2-81ad-f5fae06d4146-kube-api-access-vtmth\") pod \"placement-db-create-w8nx8\" (UID: \"295a1eb6-1b02-45f2-81ad-f5fae06d4146\") " pod="openstack/placement-db-create-w8nx8" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.142599 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-f6dc-account-create-update-k7dxc"] Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.144170 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f6dc-account-create-update-k7dxc" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.151448 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.189561 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f6dc-account-create-update-k7dxc"] Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.191736 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-60a0-account-create-update-24m4m" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.241014 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtxgx\" (UniqueName: \"kubernetes.io/projected/32904ad2-4fdc-4dc2-9b0e-2726c0c30b37-kube-api-access-qtxgx\") pod \"placement-f6dc-account-create-update-k7dxc\" (UID: \"32904ad2-4fdc-4dc2-9b0e-2726c0c30b37\") " pod="openstack/placement-f6dc-account-create-update-k7dxc" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.241142 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/295a1eb6-1b02-45f2-81ad-f5fae06d4146-operator-scripts\") pod \"placement-db-create-w8nx8\" (UID: \"295a1eb6-1b02-45f2-81ad-f5fae06d4146\") " pod="openstack/placement-db-create-w8nx8" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.241207 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32904ad2-4fdc-4dc2-9b0e-2726c0c30b37-operator-scripts\") pod \"placement-f6dc-account-create-update-k7dxc\" (UID: \"32904ad2-4fdc-4dc2-9b0e-2726c0c30b37\") " pod="openstack/placement-f6dc-account-create-update-k7dxc" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.241238 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtmth\" (UniqueName: \"kubernetes.io/projected/295a1eb6-1b02-45f2-81ad-f5fae06d4146-kube-api-access-vtmth\") pod \"placement-db-create-w8nx8\" (UID: \"295a1eb6-1b02-45f2-81ad-f5fae06d4146\") " pod="openstack/placement-db-create-w8nx8" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.250354 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8cc7fc4dc-cj64h"] Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.250886 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" podUID="99c0930b-ba5f-45ed-a20a-16346e193307" containerName="dnsmasq-dns" containerID="cri-o://461eafee70f13ed02576ce30b870d8f6055f13b32a33d66325a1c7757824aed6" gracePeriod=10 Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.252360 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/295a1eb6-1b02-45f2-81ad-f5fae06d4146-operator-scripts\") pod \"placement-db-create-w8nx8\" (UID: \"295a1eb6-1b02-45f2-81ad-f5fae06d4146\") " pod="openstack/placement-db-create-w8nx8" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.301040 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtmth\" (UniqueName: \"kubernetes.io/projected/295a1eb6-1b02-45f2-81ad-f5fae06d4146-kube-api-access-vtmth\") pod \"placement-db-create-w8nx8\" (UID: \"295a1eb6-1b02-45f2-81ad-f5fae06d4146\") " pod="openstack/placement-db-create-w8nx8" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.332182 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-w8nx8" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.351921 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32904ad2-4fdc-4dc2-9b0e-2726c0c30b37-operator-scripts\") pod \"placement-f6dc-account-create-update-k7dxc\" (UID: \"32904ad2-4fdc-4dc2-9b0e-2726c0c30b37\") " pod="openstack/placement-f6dc-account-create-update-k7dxc" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.351993 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtxgx\" (UniqueName: \"kubernetes.io/projected/32904ad2-4fdc-4dc2-9b0e-2726c0c30b37-kube-api-access-qtxgx\") pod \"placement-f6dc-account-create-update-k7dxc\" (UID: \"32904ad2-4fdc-4dc2-9b0e-2726c0c30b37\") " pod="openstack/placement-f6dc-account-create-update-k7dxc" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.352643 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32904ad2-4fdc-4dc2-9b0e-2726c0c30b37-operator-scripts\") pod \"placement-f6dc-account-create-update-k7dxc\" (UID: \"32904ad2-4fdc-4dc2-9b0e-2726c0c30b37\") " pod="openstack/placement-f6dc-account-create-update-k7dxc" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.375869 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtxgx\" (UniqueName: \"kubernetes.io/projected/32904ad2-4fdc-4dc2-9b0e-2726c0c30b37-kube-api-access-qtxgx\") pod \"placement-f6dc-account-create-update-k7dxc\" (UID: \"32904ad2-4fdc-4dc2-9b0e-2726c0c30b37\") " pod="openstack/placement-f6dc-account-create-update-k7dxc" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.465256 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.481275 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-thvvr"] Dec 05 06:10:40 crc kubenswrapper[4865]: E1205 06:10:40.481712 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea" containerName="swift-ring-rebalance" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.481724 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea" containerName="swift-ring-rebalance" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.481946 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea" containerName="swift-ring-rebalance" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.482498 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-thvvr" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.539679 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-thvvr"] Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.570692 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z7g2\" (UniqueName: \"kubernetes.io/projected/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-kube-api-access-7z7g2\") pod \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.570774 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-etc-swift\") pod \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.570858 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-dispersionconf\") pod \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.570965 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-combined-ca-bundle\") pod \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.571026 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-swiftconf\") pod \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.571085 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-scripts\") pod \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.571117 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-ring-data-devices\") pod \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\" (UID: \"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea\") " Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.571431 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx6xj\" (UniqueName: \"kubernetes.io/projected/8059be38-adc8-49f5-96f5-f5144c4ac8ee-kube-api-access-tx6xj\") pod \"glance-db-create-thvvr\" (UID: \"8059be38-adc8-49f5-96f5-f5144c4ac8ee\") " pod="openstack/glance-db-create-thvvr" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.571606 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8059be38-adc8-49f5-96f5-f5144c4ac8ee-operator-scripts\") pod \"glance-db-create-thvvr\" (UID: \"8059be38-adc8-49f5-96f5-f5144c4ac8ee\") " pod="openstack/glance-db-create-thvvr" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 
06:10:40.572242 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea" (UID: "1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.572783 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea" (UID: "1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.585526 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-56dth" podUID="30eebd2b-aed6-4866-bec4-da326d89821c" containerName="ovn-controller" probeResult="failure" output=< Dec 05 06:10:40 crc kubenswrapper[4865]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 05 06:10:40 crc kubenswrapper[4865]: > Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.586991 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-kube-api-access-7z7g2" (OuterVolumeSpecName: "kube-api-access-7z7g2") pod "1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea" (UID: "1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea"). InnerVolumeSpecName "kube-api-access-7z7g2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.599584 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-89fd-account-create-update-w9tdj"] Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.600579 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-89fd-account-create-update-w9tdj" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.615100 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.625089 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea" (UID: "1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.643671 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f6dc-account-create-update-k7dxc" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.643785 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-89fd-account-create-update-w9tdj"] Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.656475 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea" (UID: "1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.662467 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-scripts" (OuterVolumeSpecName: "scripts") pod "1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea" (UID: "1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.672784 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx6xj\" (UniqueName: \"kubernetes.io/projected/8059be38-adc8-49f5-96f5-f5144c4ac8ee-kube-api-access-tx6xj\") pod \"glance-db-create-thvvr\" (UID: \"8059be38-adc8-49f5-96f5-f5144c4ac8ee\") " pod="openstack/glance-db-create-thvvr" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.672953 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8059be38-adc8-49f5-96f5-f5144c4ac8ee-operator-scripts\") pod \"glance-db-create-thvvr\" (UID: \"8059be38-adc8-49f5-96f5-f5144c4ac8ee\") " pod="openstack/glance-db-create-thvvr" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.673044 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.673060 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.673072 4865 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.673083 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z7g2\" (UniqueName: \"kubernetes.io/projected/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-kube-api-access-7z7g2\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.673096 4865 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.673106 4865 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.677056 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea" (UID: "1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.677459 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8059be38-adc8-49f5-96f5-f5144c4ac8ee-operator-scripts\") pod \"glance-db-create-thvvr\" (UID: \"8059be38-adc8-49f5-96f5-f5144c4ac8ee\") " pod="openstack/glance-db-create-thvvr" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.729508 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx6xj\" (UniqueName: \"kubernetes.io/projected/8059be38-adc8-49f5-96f5-f5144c4ac8ee-kube-api-access-tx6xj\") pod \"glance-db-create-thvvr\" (UID: \"8059be38-adc8-49f5-96f5-f5144c4ac8ee\") " pod="openstack/glance-db-create-thvvr" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.774986 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3ab494b-c8fa-42da-af71-d24aaaafe086-operator-scripts\") pod \"glance-89fd-account-create-update-w9tdj\" (UID: \"c3ab494b-c8fa-42da-af71-d24aaaafe086\") " pod="openstack/glance-89fd-account-create-update-w9tdj" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.775036 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkwcs\" (UniqueName: \"kubernetes.io/projected/c3ab494b-c8fa-42da-af71-d24aaaafe086-kube-api-access-qkwcs\") pod \"glance-89fd-account-create-update-w9tdj\" (UID: \"c3ab494b-c8fa-42da-af71-d24aaaafe086\") " pod="openstack/glance-89fd-account-create-update-w9tdj" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.775191 4865 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.810226 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-thvvr" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.882174 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3ab494b-c8fa-42da-af71-d24aaaafe086-operator-scripts\") pod \"glance-89fd-account-create-update-w9tdj\" (UID: \"c3ab494b-c8fa-42da-af71-d24aaaafe086\") " pod="openstack/glance-89fd-account-create-update-w9tdj" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.882216 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkwcs\" (UniqueName: \"kubernetes.io/projected/c3ab494b-c8fa-42da-af71-d24aaaafe086-kube-api-access-qkwcs\") pod \"glance-89fd-account-create-update-w9tdj\" (UID: \"c3ab494b-c8fa-42da-af71-d24aaaafe086\") " pod="openstack/glance-89fd-account-create-update-w9tdj" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.883149 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3ab494b-c8fa-42da-af71-d24aaaafe086-operator-scripts\") pod \"glance-89fd-account-create-update-w9tdj\" (UID: \"c3ab494b-c8fa-42da-af71-d24aaaafe086\") " pod="openstack/glance-89fd-account-create-update-w9tdj" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.916188 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkwcs\" (UniqueName: \"kubernetes.io/projected/c3ab494b-c8fa-42da-af71-d24aaaafe086-kube-api-access-qkwcs\") pod \"glance-89fd-account-create-update-w9tdj\" (UID: \"c3ab494b-c8fa-42da-af71-d24aaaafe086\") " pod="openstack/glance-89fd-account-create-update-w9tdj" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.925490 4865 generic.go:334] "Generic (PLEG): container finished" podID="99c0930b-ba5f-45ed-a20a-16346e193307" containerID="461eafee70f13ed02576ce30b870d8f6055f13b32a33d66325a1c7757824aed6" exitCode=0 Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.925612 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" event={"ID":"99c0930b-ba5f-45ed-a20a-16346e193307","Type":"ContainerDied","Data":"461eafee70f13ed02576ce30b870d8f6055f13b32a33d66325a1c7757824aed6"} Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.931459 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-60a0-account-create-update-24m4m"] Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.967515 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-89fd-account-create-update-w9tdj" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.968283 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gx5lg" event={"ID":"1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea","Type":"ContainerDied","Data":"6cdd78b0b2433e9ffdc7ca5f5e04405cdf7e3f6e798f02aa03807ea34c5d384d"} Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.968536 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cdd78b0b2433e9ffdc7ca5f5e04405cdf7e3f6e798f02aa03807ea34c5d384d" Dec 05 06:10:40 crc kubenswrapper[4865]: I1205 06:10:40.968680 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gx5lg" Dec 05 06:10:40 crc kubenswrapper[4865]: W1205 06:10:40.983399 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d4fad1c_f15e_4d44_a71c_24d196f3c8fe.slice/crio-0f19adeddd589b5ab21939006748a45619dc05a952ba9cf2e7f73515f0e22509 WatchSource:0}: Error finding container 0f19adeddd589b5ab21939006748a45619dc05a952ba9cf2e7f73515f0e22509: Status 404 returned error can't find the container with id 0f19adeddd589b5ab21939006748a45619dc05a952ba9cf2e7f73515f0e22509 Dec 05 06:10:41 crc kubenswrapper[4865]: W1205 06:10:41.050150 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18914495_0dfa_4528_ac93_942ccad6f5a3.slice/crio-bfc26553dcfbbd62e150d2ec80ce289a29cab9ecb1b905d5b769f55bf5cb3156 WatchSource:0}: Error finding container bfc26553dcfbbd62e150d2ec80ce289a29cab9ecb1b905d5b769f55bf5cb3156: Status 404 returned error can't find the container with id bfc26553dcfbbd62e150d2ec80ce289a29cab9ecb1b905d5b769f55bf5cb3156 Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.059219 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-kktpw"] Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.060894 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.189460 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qshnb\" (UniqueName: \"kubernetes.io/projected/99c0930b-ba5f-45ed-a20a-16346e193307-kube-api-access-qshnb\") pod \"99c0930b-ba5f-45ed-a20a-16346e193307\" (UID: \"99c0930b-ba5f-45ed-a20a-16346e193307\") " Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.189523 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99c0930b-ba5f-45ed-a20a-16346e193307-dns-svc\") pod \"99c0930b-ba5f-45ed-a20a-16346e193307\" (UID: \"99c0930b-ba5f-45ed-a20a-16346e193307\") " Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.189622 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99c0930b-ba5f-45ed-a20a-16346e193307-ovsdbserver-sb\") pod \"99c0930b-ba5f-45ed-a20a-16346e193307\" (UID: \"99c0930b-ba5f-45ed-a20a-16346e193307\") " Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.189657 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99c0930b-ba5f-45ed-a20a-16346e193307-config\") pod \"99c0930b-ba5f-45ed-a20a-16346e193307\" (UID: \"99c0930b-ba5f-45ed-a20a-16346e193307\") " Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.204472 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99c0930b-ba5f-45ed-a20a-16346e193307-kube-api-access-qshnb" (OuterVolumeSpecName: "kube-api-access-qshnb") pod "99c0930b-ba5f-45ed-a20a-16346e193307" (UID: "99c0930b-ba5f-45ed-a20a-16346e193307"). InnerVolumeSpecName "kube-api-access-qshnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.214153 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-w8nx8"] Dec 05 06:10:41 crc kubenswrapper[4865]: W1205 06:10:41.214430 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod295a1eb6_1b02_45f2_81ad_f5fae06d4146.slice/crio-c19386a46b63e158927d700d6040b7d42fc92370c1324c1f08ba3996a21d2db8 WatchSource:0}: Error finding container c19386a46b63e158927d700d6040b7d42fc92370c1324c1f08ba3996a21d2db8: Status 404 returned error can't find the container with id c19386a46b63e158927d700d6040b7d42fc92370c1324c1f08ba3996a21d2db8 Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.278013 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99c0930b-ba5f-45ed-a20a-16346e193307-config" (OuterVolumeSpecName: "config") pod "99c0930b-ba5f-45ed-a20a-16346e193307" (UID: "99c0930b-ba5f-45ed-a20a-16346e193307"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.279780 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99c0930b-ba5f-45ed-a20a-16346e193307-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "99c0930b-ba5f-45ed-a20a-16346e193307" (UID: "99c0930b-ba5f-45ed-a20a-16346e193307"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.280742 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99c0930b-ba5f-45ed-a20a-16346e193307-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "99c0930b-ba5f-45ed-a20a-16346e193307" (UID: "99c0930b-ba5f-45ed-a20a-16346e193307"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.291551 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99c0930b-ba5f-45ed-a20a-16346e193307-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.291603 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qshnb\" (UniqueName: \"kubernetes.io/projected/99c0930b-ba5f-45ed-a20a-16346e193307-kube-api-access-qshnb\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.291612 4865 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99c0930b-ba5f-45ed-a20a-16346e193307-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.291620 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99c0930b-ba5f-45ed-a20a-16346e193307-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.359363 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f6dc-account-create-update-k7dxc"] Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.494283 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-thvvr"] Dec 05 06:10:41 crc kubenswrapper[4865]: W1205 06:10:41.506194 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8059be38_adc8_49f5_96f5_f5144c4ac8ee.slice/crio-266e760a6744fd42082c4a5affa29061632ef07cf0834747075539dbd4c48a7e WatchSource:0}: Error finding container 266e760a6744fd42082c4a5affa29061632ef07cf0834747075539dbd4c48a7e: Status 404 returned error can't find the container with id 266e760a6744fd42082c4a5affa29061632ef07cf0834747075539dbd4c48a7e Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.606525 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-89fd-account-create-update-w9tdj"] Dec 05 06:10:41 crc kubenswrapper[4865]: W1205 06:10:41.613969 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3ab494b_c8fa_42da_af71_d24aaaafe086.slice/crio-3efae1d5824d3e38b467550fc4bc9a39fbfa8a3ebc54be0bdb091df6fc727303 WatchSource:0}: Error finding container 3efae1d5824d3e38b467550fc4bc9a39fbfa8a3ebc54be0bdb091df6fc727303: Status 404 returned error can't find the container with id 3efae1d5824d3e38b467550fc4bc9a39fbfa8a3ebc54be0bdb091df6fc727303 Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.983956 4865 generic.go:334] "Generic (PLEG): container finished" podID="18914495-0dfa-4528-ac93-942ccad6f5a3" containerID="9284e885fd5bc8313d85d153dfeab728819bf379c36989710816d5947b445b0a" exitCode=0 Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.984027 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-kktpw" event={"ID":"18914495-0dfa-4528-ac93-942ccad6f5a3","Type":"ContainerDied","Data":"9284e885fd5bc8313d85d153dfeab728819bf379c36989710816d5947b445b0a"} Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.984231 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-kktpw" 
event={"ID":"18914495-0dfa-4528-ac93-942ccad6f5a3","Type":"ContainerStarted","Data":"bfc26553dcfbbd62e150d2ec80ce289a29cab9ecb1b905d5b769f55bf5cb3156"} Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.989917 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" event={"ID":"99c0930b-ba5f-45ed-a20a-16346e193307","Type":"ContainerDied","Data":"81144be3f22a20e103a202245b0089aa5c7e12b60ac75b733f963307267507d6"} Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.990158 4865 scope.go:117] "RemoveContainer" containerID="461eafee70f13ed02576ce30b870d8f6055f13b32a33d66325a1c7757824aed6" Dec 05 06:10:41 crc kubenswrapper[4865]: I1205 06:10:41.989946 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8cc7fc4dc-cj64h" Dec 05 06:10:42 crc kubenswrapper[4865]: I1205 06:10:42.003808 4865 generic.go:334] "Generic (PLEG): container finished" podID="6d4fad1c-f15e-4d44-a71c-24d196f3c8fe" containerID="e6dcc2573a6d56fc745e9ae9f9b1e60b32677fa1ced22780bfc93d86ee54d7c7" exitCode=0 Dec 05 06:10:42 crc kubenswrapper[4865]: I1205 06:10:42.004095 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-60a0-account-create-update-24m4m" event={"ID":"6d4fad1c-f15e-4d44-a71c-24d196f3c8fe","Type":"ContainerDied","Data":"e6dcc2573a6d56fc745e9ae9f9b1e60b32677fa1ced22780bfc93d86ee54d7c7"} Dec 05 06:10:42 crc kubenswrapper[4865]: I1205 06:10:42.004206 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-60a0-account-create-update-24m4m" event={"ID":"6d4fad1c-f15e-4d44-a71c-24d196f3c8fe","Type":"ContainerStarted","Data":"0f19adeddd589b5ab21939006748a45619dc05a952ba9cf2e7f73515f0e22509"} Dec 05 06:10:42 crc kubenswrapper[4865]: I1205 06:10:42.008481 4865 generic.go:334] "Generic (PLEG): container finished" podID="295a1eb6-1b02-45f2-81ad-f5fae06d4146" containerID="ba16dd313ee63c2b879a00bf928c24910a963011ac18e2191dbd94c0823e66d9" exitCode=0 Dec 05 06:10:42 crc kubenswrapper[4865]: I1205 06:10:42.008578 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w8nx8" event={"ID":"295a1eb6-1b02-45f2-81ad-f5fae06d4146","Type":"ContainerDied","Data":"ba16dd313ee63c2b879a00bf928c24910a963011ac18e2191dbd94c0823e66d9"} Dec 05 06:10:42 crc kubenswrapper[4865]: I1205 06:10:42.008600 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w8nx8" event={"ID":"295a1eb6-1b02-45f2-81ad-f5fae06d4146","Type":"ContainerStarted","Data":"c19386a46b63e158927d700d6040b7d42fc92370c1324c1f08ba3996a21d2db8"} Dec 05 06:10:42 crc kubenswrapper[4865]: I1205 06:10:42.010717 4865 generic.go:334] "Generic (PLEG): container finished" podID="8059be38-adc8-49f5-96f5-f5144c4ac8ee" containerID="72c6191d824e26de87e9c5027b3f37ce377b6322b626e2cf4c741959ccbe8da9" exitCode=0 Dec 05 06:10:42 crc kubenswrapper[4865]: I1205 06:10:42.010756 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-thvvr" event={"ID":"8059be38-adc8-49f5-96f5-f5144c4ac8ee","Type":"ContainerDied","Data":"72c6191d824e26de87e9c5027b3f37ce377b6322b626e2cf4c741959ccbe8da9"} Dec 05 06:10:42 crc kubenswrapper[4865]: I1205 06:10:42.010770 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-thvvr" event={"ID":"8059be38-adc8-49f5-96f5-f5144c4ac8ee","Type":"ContainerStarted","Data":"266e760a6744fd42082c4a5affa29061632ef07cf0834747075539dbd4c48a7e"} Dec 05 06:10:42 crc kubenswrapper[4865]: I1205 06:10:42.012208 
4865 generic.go:334] "Generic (PLEG): container finished" podID="32904ad2-4fdc-4dc2-9b0e-2726c0c30b37" containerID="49ec4a20d00a1d81303f01e61c881bb5a94b7b365bf85e056ff3ec74e256f265" exitCode=0 Dec 05 06:10:42 crc kubenswrapper[4865]: I1205 06:10:42.012242 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f6dc-account-create-update-k7dxc" event={"ID":"32904ad2-4fdc-4dc2-9b0e-2726c0c30b37","Type":"ContainerDied","Data":"49ec4a20d00a1d81303f01e61c881bb5a94b7b365bf85e056ff3ec74e256f265"} Dec 05 06:10:42 crc kubenswrapper[4865]: I1205 06:10:42.012255 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f6dc-account-create-update-k7dxc" event={"ID":"32904ad2-4fdc-4dc2-9b0e-2726c0c30b37","Type":"ContainerStarted","Data":"ab80e979a769053a45ac5ee0e49a069224165058976f3fce8a04ff6cb58f52c0"} Dec 05 06:10:42 crc kubenswrapper[4865]: I1205 06:10:42.015158 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-89fd-account-create-update-w9tdj" event={"ID":"c3ab494b-c8fa-42da-af71-d24aaaafe086","Type":"ContainerStarted","Data":"d7b0c16db59b802d99b2c3229577baf8f2f85618e20d780445d3e57a955896af"} Dec 05 06:10:42 crc kubenswrapper[4865]: I1205 06:10:42.015367 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-89fd-account-create-update-w9tdj" event={"ID":"c3ab494b-c8fa-42da-af71-d24aaaafe086","Type":"ContainerStarted","Data":"3efae1d5824d3e38b467550fc4bc9a39fbfa8a3ebc54be0bdb091df6fc727303"} Dec 05 06:10:42 crc kubenswrapper[4865]: I1205 06:10:42.078883 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-89fd-account-create-update-w9tdj" podStartSLOduration=2.078855265 podStartE2EDuration="2.078855265s" podCreationTimestamp="2025-12-05 06:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:10:42.07475783 +0000 UTC m=+1061.354769062" watchObservedRunningTime="2025-12-05 06:10:42.078855265 +0000 UTC m=+1061.358866497" Dec 05 06:10:42 crc kubenswrapper[4865]: I1205 06:10:42.112100 4865 scope.go:117] "RemoveContainer" containerID="a12f75413fb86a56001913166df08b000e51cd97f02c5cf3e2621b1fa9181091" Dec 05 06:10:42 crc kubenswrapper[4865]: I1205 06:10:42.126643 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8cc7fc4dc-cj64h"] Dec 05 06:10:42 crc kubenswrapper[4865]: I1205 06:10:42.131637 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8cc7fc4dc-cj64h"] Dec 05 06:10:43 crc kubenswrapper[4865]: I1205 06:10:43.022246 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99c0930b-ba5f-45ed-a20a-16346e193307" path="/var/lib/kubelet/pods/99c0930b-ba5f-45ed-a20a-16346e193307/volumes" Dec 05 06:10:43 crc kubenswrapper[4865]: I1205 06:10:43.027364 4865 generic.go:334] "Generic (PLEG): container finished" podID="c3ab494b-c8fa-42da-af71-d24aaaafe086" containerID="d7b0c16db59b802d99b2c3229577baf8f2f85618e20d780445d3e57a955896af" exitCode=0 Dec 05 06:10:43 crc kubenswrapper[4865]: I1205 06:10:43.027451 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-89fd-account-create-update-w9tdj" event={"ID":"c3ab494b-c8fa-42da-af71-d24aaaafe086","Type":"ContainerDied","Data":"d7b0c16db59b802d99b2c3229577baf8f2f85618e20d780445d3e57a955896af"} Dec 05 06:10:43 crc kubenswrapper[4865]: I1205 06:10:43.487270 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-60a0-account-create-update-24m4m" Dec 05 06:10:43 crc kubenswrapper[4865]: I1205 06:10:43.641277 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d4fad1c-f15e-4d44-a71c-24d196f3c8fe-operator-scripts\") pod \"6d4fad1c-f15e-4d44-a71c-24d196f3c8fe\" (UID: \"6d4fad1c-f15e-4d44-a71c-24d196f3c8fe\") " Dec 05 06:10:43 crc kubenswrapper[4865]: I1205 06:10:43.641736 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h96nj\" (UniqueName: \"kubernetes.io/projected/6d4fad1c-f15e-4d44-a71c-24d196f3c8fe-kube-api-access-h96nj\") pod \"6d4fad1c-f15e-4d44-a71c-24d196f3c8fe\" (UID: \"6d4fad1c-f15e-4d44-a71c-24d196f3c8fe\") " Dec 05 06:10:43 crc kubenswrapper[4865]: I1205 06:10:43.642301 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d4fad1c-f15e-4d44-a71c-24d196f3c8fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d4fad1c-f15e-4d44-a71c-24d196f3c8fe" (UID: "6d4fad1c-f15e-4d44-a71c-24d196f3c8fe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:43 crc kubenswrapper[4865]: I1205 06:10:43.651662 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d4fad1c-f15e-4d44-a71c-24d196f3c8fe-kube-api-access-h96nj" (OuterVolumeSpecName: "kube-api-access-h96nj") pod "6d4fad1c-f15e-4d44-a71c-24d196f3c8fe" (UID: "6d4fad1c-f15e-4d44-a71c-24d196f3c8fe"). InnerVolumeSpecName "kube-api-access-h96nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:10:43 crc kubenswrapper[4865]: I1205 06:10:43.744138 4865 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d4fad1c-f15e-4d44-a71c-24d196f3c8fe-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:43 crc kubenswrapper[4865]: I1205 06:10:43.744192 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h96nj\" (UniqueName: \"kubernetes.io/projected/6d4fad1c-f15e-4d44-a71c-24d196f3c8fe-kube-api-access-h96nj\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.041491 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-60a0-account-create-update-24m4m" event={"ID":"6d4fad1c-f15e-4d44-a71c-24d196f3c8fe","Type":"ContainerDied","Data":"0f19adeddd589b5ab21939006748a45619dc05a952ba9cf2e7f73515f0e22509"} Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.041786 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f19adeddd589b5ab21939006748a45619dc05a952ba9cf2e7f73515f0e22509" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.041517 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-60a0-account-create-update-24m4m" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.787171 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f6dc-account-create-update-k7dxc" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.792901 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-thvvr" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.806426 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-kktpw" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.827993 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-w8nx8" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.868414 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8059be38-adc8-49f5-96f5-f5144c4ac8ee-operator-scripts\") pod \"8059be38-adc8-49f5-96f5-f5144c4ac8ee\" (UID: \"8059be38-adc8-49f5-96f5-f5144c4ac8ee\") " Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.868458 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtxgx\" (UniqueName: \"kubernetes.io/projected/32904ad2-4fdc-4dc2-9b0e-2726c0c30b37-kube-api-access-qtxgx\") pod \"32904ad2-4fdc-4dc2-9b0e-2726c0c30b37\" (UID: \"32904ad2-4fdc-4dc2-9b0e-2726c0c30b37\") " Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.868562 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18914495-0dfa-4528-ac93-942ccad6f5a3-operator-scripts\") pod \"18914495-0dfa-4528-ac93-942ccad6f5a3\" (UID: \"18914495-0dfa-4528-ac93-942ccad6f5a3\") " Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.868584 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx6xj\" (UniqueName: \"kubernetes.io/projected/8059be38-adc8-49f5-96f5-f5144c4ac8ee-kube-api-access-tx6xj\") pod \"8059be38-adc8-49f5-96f5-f5144c4ac8ee\" (UID: \"8059be38-adc8-49f5-96f5-f5144c4ac8ee\") " Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.868617 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g94bd\" (UniqueName: \"kubernetes.io/projected/18914495-0dfa-4528-ac93-942ccad6f5a3-kube-api-access-g94bd\") pod \"18914495-0dfa-4528-ac93-942ccad6f5a3\" (UID: \"18914495-0dfa-4528-ac93-942ccad6f5a3\") " Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.868642 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32904ad2-4fdc-4dc2-9b0e-2726c0c30b37-operator-scripts\") pod \"32904ad2-4fdc-4dc2-9b0e-2726c0c30b37\" (UID: \"32904ad2-4fdc-4dc2-9b0e-2726c0c30b37\") " Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.870017 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8059be38-adc8-49f5-96f5-f5144c4ac8ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8059be38-adc8-49f5-96f5-f5144c4ac8ee" (UID: "8059be38-adc8-49f5-96f5-f5144c4ac8ee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.870377 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18914495-0dfa-4528-ac93-942ccad6f5a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "18914495-0dfa-4528-ac93-942ccad6f5a3" (UID: "18914495-0dfa-4528-ac93-942ccad6f5a3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.870507 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32904ad2-4fdc-4dc2-9b0e-2726c0c30b37-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32904ad2-4fdc-4dc2-9b0e-2726c0c30b37" (UID: "32904ad2-4fdc-4dc2-9b0e-2726c0c30b37"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.886302 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18914495-0dfa-4528-ac93-942ccad6f5a3-kube-api-access-g94bd" (OuterVolumeSpecName: "kube-api-access-g94bd") pod "18914495-0dfa-4528-ac93-942ccad6f5a3" (UID: "18914495-0dfa-4528-ac93-942ccad6f5a3"). InnerVolumeSpecName "kube-api-access-g94bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.886422 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8059be38-adc8-49f5-96f5-f5144c4ac8ee-kube-api-access-tx6xj" (OuterVolumeSpecName: "kube-api-access-tx6xj") pod "8059be38-adc8-49f5-96f5-f5144c4ac8ee" (UID: "8059be38-adc8-49f5-96f5-f5144c4ac8ee"). InnerVolumeSpecName "kube-api-access-tx6xj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.886488 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32904ad2-4fdc-4dc2-9b0e-2726c0c30b37-kube-api-access-qtxgx" (OuterVolumeSpecName: "kube-api-access-qtxgx") pod "32904ad2-4fdc-4dc2-9b0e-2726c0c30b37" (UID: "32904ad2-4fdc-4dc2-9b0e-2726c0c30b37"). InnerVolumeSpecName "kube-api-access-qtxgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.927876 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-89fd-account-create-update-w9tdj" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.970386 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/295a1eb6-1b02-45f2-81ad-f5fae06d4146-operator-scripts\") pod \"295a1eb6-1b02-45f2-81ad-f5fae06d4146\" (UID: \"295a1eb6-1b02-45f2-81ad-f5fae06d4146\") " Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.970466 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtmth\" (UniqueName: \"kubernetes.io/projected/295a1eb6-1b02-45f2-81ad-f5fae06d4146-kube-api-access-vtmth\") pod \"295a1eb6-1b02-45f2-81ad-f5fae06d4146\" (UID: \"295a1eb6-1b02-45f2-81ad-f5fae06d4146\") " Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.970898 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295a1eb6-1b02-45f2-81ad-f5fae06d4146-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "295a1eb6-1b02-45f2-81ad-f5fae06d4146" (UID: "295a1eb6-1b02-45f2-81ad-f5fae06d4146"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.970979 4865 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18914495-0dfa-4528-ac93-942ccad6f5a3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.970992 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx6xj\" (UniqueName: \"kubernetes.io/projected/8059be38-adc8-49f5-96f5-f5144c4ac8ee-kube-api-access-tx6xj\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.971003 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g94bd\" (UniqueName: \"kubernetes.io/projected/18914495-0dfa-4528-ac93-942ccad6f5a3-kube-api-access-g94bd\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.971011 4865 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32904ad2-4fdc-4dc2-9b0e-2726c0c30b37-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.971021 4865 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8059be38-adc8-49f5-96f5-f5144c4ac8ee-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.971030 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtxgx\" (UniqueName: \"kubernetes.io/projected/32904ad2-4fdc-4dc2-9b0e-2726c0c30b37-kube-api-access-qtxgx\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:44 crc kubenswrapper[4865]: I1205 06:10:44.973916 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295a1eb6-1b02-45f2-81ad-f5fae06d4146-kube-api-access-vtmth" (OuterVolumeSpecName: "kube-api-access-vtmth") pod "295a1eb6-1b02-45f2-81ad-f5fae06d4146" (UID: "295a1eb6-1b02-45f2-81ad-f5fae06d4146"). InnerVolumeSpecName "kube-api-access-vtmth". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.053997 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w8nx8" event={"ID":"295a1eb6-1b02-45f2-81ad-f5fae06d4146","Type":"ContainerDied","Data":"c19386a46b63e158927d700d6040b7d42fc92370c1324c1f08ba3996a21d2db8"} Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.054064 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c19386a46b63e158927d700d6040b7d42fc92370c1324c1f08ba3996a21d2db8" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.054141 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-w8nx8" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.056419 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-thvvr" event={"ID":"8059be38-adc8-49f5-96f5-f5144c4ac8ee","Type":"ContainerDied","Data":"266e760a6744fd42082c4a5affa29061632ef07cf0834747075539dbd4c48a7e"} Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.056444 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="266e760a6744fd42082c4a5affa29061632ef07cf0834747075539dbd4c48a7e" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.056578 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-thvvr" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.058249 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f6dc-account-create-update-k7dxc" event={"ID":"32904ad2-4fdc-4dc2-9b0e-2726c0c30b37","Type":"ContainerDied","Data":"ab80e979a769053a45ac5ee0e49a069224165058976f3fce8a04ff6cb58f52c0"} Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.058343 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab80e979a769053a45ac5ee0e49a069224165058976f3fce8a04ff6cb58f52c0" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.058452 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f6dc-account-create-update-k7dxc" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.061403 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-89fd-account-create-update-w9tdj" event={"ID":"c3ab494b-c8fa-42da-af71-d24aaaafe086","Type":"ContainerDied","Data":"3efae1d5824d3e38b467550fc4bc9a39fbfa8a3ebc54be0bdb091df6fc727303"} Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.061512 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3efae1d5824d3e38b467550fc4bc9a39fbfa8a3ebc54be0bdb091df6fc727303" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.061604 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-89fd-account-create-update-w9tdj" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.063543 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-kktpw" event={"ID":"18914495-0dfa-4528-ac93-942ccad6f5a3","Type":"ContainerDied","Data":"bfc26553dcfbbd62e150d2ec80ce289a29cab9ecb1b905d5b769f55bf5cb3156"} Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.063582 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfc26553dcfbbd62e150d2ec80ce289a29cab9ecb1b905d5b769f55bf5cb3156" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.063646 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-kktpw" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.071709 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3ab494b-c8fa-42da-af71-d24aaaafe086-operator-scripts\") pod \"c3ab494b-c8fa-42da-af71-d24aaaafe086\" (UID: \"c3ab494b-c8fa-42da-af71-d24aaaafe086\") " Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.071934 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkwcs\" (UniqueName: \"kubernetes.io/projected/c3ab494b-c8fa-42da-af71-d24aaaafe086-kube-api-access-qkwcs\") pod \"c3ab494b-c8fa-42da-af71-d24aaaafe086\" (UID: \"c3ab494b-c8fa-42da-af71-d24aaaafe086\") " Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.072296 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3ab494b-c8fa-42da-af71-d24aaaafe086-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c3ab494b-c8fa-42da-af71-d24aaaafe086" (UID: "c3ab494b-c8fa-42da-af71-d24aaaafe086"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.074804 4865 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3ab494b-c8fa-42da-af71-d24aaaafe086-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.074851 4865 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/295a1eb6-1b02-45f2-81ad-f5fae06d4146-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.074864 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtmth\" (UniqueName: \"kubernetes.io/projected/295a1eb6-1b02-45f2-81ad-f5fae06d4146-kube-api-access-vtmth\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.082978 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3ab494b-c8fa-42da-af71-d24aaaafe086-kube-api-access-qkwcs" (OuterVolumeSpecName: "kube-api-access-qkwcs") pod "c3ab494b-c8fa-42da-af71-d24aaaafe086" (UID: "c3ab494b-c8fa-42da-af71-d24aaaafe086"). InnerVolumeSpecName "kube-api-access-qkwcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.176269 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkwcs\" (UniqueName: \"kubernetes.io/projected/c3ab494b-c8fa-42da-af71-d24aaaafe086-kube-api-access-qkwcs\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.473024 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.476096 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-56dth" podUID="30eebd2b-aed6-4866-bec4-da326d89821c" containerName="ovn-controller" probeResult="failure" output=< Dec 05 06:10:45 crc kubenswrapper[4865]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 05 06:10:45 crc kubenswrapper[4865]: > Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.477452 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-jbvmz" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.714516 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-56dth-config-c28v4"] Dec 05 06:10:45 crc kubenswrapper[4865]: E1205 06:10:45.714901 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c0930b-ba5f-45ed-a20a-16346e193307" containerName="dnsmasq-dns" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.714916 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c0930b-ba5f-45ed-a20a-16346e193307" containerName="dnsmasq-dns" Dec 05 06:10:45 crc kubenswrapper[4865]: E1205 06:10:45.714937 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3ab494b-c8fa-42da-af71-d24aaaafe086" containerName="mariadb-account-create-update" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.714945 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ab494b-c8fa-42da-af71-d24aaaafe086" containerName="mariadb-account-create-update" Dec 05 06:10:45 crc kubenswrapper[4865]: E1205 06:10:45.714957 4865 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="295a1eb6-1b02-45f2-81ad-f5fae06d4146" containerName="mariadb-database-create" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.714966 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="295a1eb6-1b02-45f2-81ad-f5fae06d4146" containerName="mariadb-database-create" Dec 05 06:10:45 crc kubenswrapper[4865]: E1205 06:10:45.714980 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32904ad2-4fdc-4dc2-9b0e-2726c0c30b37" containerName="mariadb-account-create-update" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.714989 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="32904ad2-4fdc-4dc2-9b0e-2726c0c30b37" containerName="mariadb-account-create-update" Dec 05 06:10:45 crc kubenswrapper[4865]: E1205 06:10:45.715011 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d4fad1c-f15e-4d44-a71c-24d196f3c8fe" containerName="mariadb-account-create-update" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.715020 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d4fad1c-f15e-4d44-a71c-24d196f3c8fe" containerName="mariadb-account-create-update" Dec 05 06:10:45 crc kubenswrapper[4865]: E1205 06:10:45.715038 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18914495-0dfa-4528-ac93-942ccad6f5a3" containerName="mariadb-database-create" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.715046 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="18914495-0dfa-4528-ac93-942ccad6f5a3" containerName="mariadb-database-create" Dec 05 06:10:45 crc kubenswrapper[4865]: E1205 06:10:45.720310 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8059be38-adc8-49f5-96f5-f5144c4ac8ee" containerName="mariadb-database-create" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.720346 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="8059be38-adc8-49f5-96f5-f5144c4ac8ee" containerName="mariadb-database-create" Dec 05 06:10:45 crc kubenswrapper[4865]: E1205 06:10:45.720368 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c0930b-ba5f-45ed-a20a-16346e193307" containerName="init" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.720376 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c0930b-ba5f-45ed-a20a-16346e193307" containerName="init" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.720701 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d4fad1c-f15e-4d44-a71c-24d196f3c8fe" containerName="mariadb-account-create-update" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.720718 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="99c0930b-ba5f-45ed-a20a-16346e193307" containerName="dnsmasq-dns" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.720725 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="295a1eb6-1b02-45f2-81ad-f5fae06d4146" containerName="mariadb-database-create" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.720739 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3ab494b-c8fa-42da-af71-d24aaaafe086" containerName="mariadb-account-create-update" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.720755 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="8059be38-adc8-49f5-96f5-f5144c4ac8ee" containerName="mariadb-database-create" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.720763 4865 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="32904ad2-4fdc-4dc2-9b0e-2726c0c30b37" containerName="mariadb-account-create-update" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.720773 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="18914495-0dfa-4528-ac93-942ccad6f5a3" containerName="mariadb-database-create" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.721376 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.726992 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.729263 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-56dth-config-c28v4"] Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.789185 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpzmt\" (UniqueName: \"kubernetes.io/projected/0b0192d1-09f9-4288-9974-4722499aa70c-kube-api-access-zpzmt\") pod \"ovn-controller-56dth-config-c28v4\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.789349 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0b0192d1-09f9-4288-9974-4722499aa70c-var-run-ovn\") pod \"ovn-controller-56dth-config-c28v4\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.789391 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0b0192d1-09f9-4288-9974-4722499aa70c-var-run\") pod \"ovn-controller-56dth-config-c28v4\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.789428 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0b0192d1-09f9-4288-9974-4722499aa70c-additional-scripts\") pod \"ovn-controller-56dth-config-c28v4\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.789472 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b0192d1-09f9-4288-9974-4722499aa70c-scripts\") pod \"ovn-controller-56dth-config-c28v4\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.789500 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0b0192d1-09f9-4288-9974-4722499aa70c-var-log-ovn\") pod \"ovn-controller-56dth-config-c28v4\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.890921 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/0b0192d1-09f9-4288-9974-4722499aa70c-var-run-ovn\") pod \"ovn-controller-56dth-config-c28v4\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.890995 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0b0192d1-09f9-4288-9974-4722499aa70c-var-run\") pod \"ovn-controller-56dth-config-c28v4\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.891031 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0b0192d1-09f9-4288-9974-4722499aa70c-additional-scripts\") pod \"ovn-controller-56dth-config-c28v4\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.891070 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b0192d1-09f9-4288-9974-4722499aa70c-scripts\") pod \"ovn-controller-56dth-config-c28v4\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.891095 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0b0192d1-09f9-4288-9974-4722499aa70c-var-log-ovn\") pod \"ovn-controller-56dth-config-c28v4\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.891240 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0b0192d1-09f9-4288-9974-4722499aa70c-var-run-ovn\") pod \"ovn-controller-56dth-config-c28v4\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.891256 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpzmt\" (UniqueName: \"kubernetes.io/projected/0b0192d1-09f9-4288-9974-4722499aa70c-kube-api-access-zpzmt\") pod \"ovn-controller-56dth-config-c28v4\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.891346 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0b0192d1-09f9-4288-9974-4722499aa70c-var-log-ovn\") pod \"ovn-controller-56dth-config-c28v4\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.891267 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0b0192d1-09f9-4288-9974-4722499aa70c-var-run\") pod \"ovn-controller-56dth-config-c28v4\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.891912 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0b0192d1-09f9-4288-9974-4722499aa70c-additional-scripts\") pod \"ovn-controller-56dth-config-c28v4\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.892949 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b0192d1-09f9-4288-9974-4722499aa70c-scripts\") pod \"ovn-controller-56dth-config-c28v4\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:45 crc kubenswrapper[4865]: I1205 06:10:45.926532 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpzmt\" (UniqueName: \"kubernetes.io/projected/0b0192d1-09f9-4288-9974-4722499aa70c-kube-api-access-zpzmt\") pod \"ovn-controller-56dth-config-c28v4\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:46 crc kubenswrapper[4865]: I1205 06:10:46.036434 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:46 crc kubenswrapper[4865]: I1205 06:10:46.197168 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:46 crc kubenswrapper[4865]: I1205 06:10:46.202608 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f-etc-swift\") pod \"swift-storage-0\" (UID: \"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f\") " pod="openstack/swift-storage-0" Dec 05 06:10:46 crc kubenswrapper[4865]: I1205 06:10:46.326935 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 05 06:10:46 crc kubenswrapper[4865]: I1205 06:10:46.436752 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-56dth-config-c28v4"] Dec 05 06:10:46 crc kubenswrapper[4865]: I1205 06:10:46.678593 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 05 06:10:46 crc kubenswrapper[4865]: I1205 06:10:46.915165 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 05 06:10:47 crc kubenswrapper[4865]: I1205 06:10:47.078127 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f","Type":"ContainerStarted","Data":"57c9bf33c68972a8ddaf51037b5b8a1ea814251bb27d4100b0af299cbd703005"} Dec 05 06:10:47 crc kubenswrapper[4865]: I1205 06:10:47.079353 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56dth-config-c28v4" event={"ID":"0b0192d1-09f9-4288-9974-4722499aa70c","Type":"ContainerStarted","Data":"d8a5c42fa71fcb839cb32be74a4860d9c055f794d10b51607b3f1c0c43ce20d6"} Dec 05 06:10:47 crc kubenswrapper[4865]: I1205 06:10:47.079397 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56dth-config-c28v4" event={"ID":"0b0192d1-09f9-4288-9974-4722499aa70c","Type":"ContainerStarted","Data":"c4d29a04f07603c1e7e45e0d44f64576a3909260d12fb87bf48ae98060736c23"} Dec 05 06:10:47 crc kubenswrapper[4865]: I1205 06:10:47.101890 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-56dth-config-c28v4" podStartSLOduration=2.101867455 podStartE2EDuration="2.101867455s" podCreationTimestamp="2025-12-05 06:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:10:47.095768572 +0000 UTC m=+1066.375779814" watchObservedRunningTime="2025-12-05 06:10:47.101867455 +0000 UTC m=+1066.381878677" Dec 05 06:10:48 crc kubenswrapper[4865]: I1205 06:10:48.087382 4865 generic.go:334] "Generic (PLEG): container finished" podID="0b0192d1-09f9-4288-9974-4722499aa70c" containerID="d8a5c42fa71fcb839cb32be74a4860d9c055f794d10b51607b3f1c0c43ce20d6" exitCode=0 Dec 05 06:10:48 crc kubenswrapper[4865]: I1205 06:10:48.087433 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56dth-config-c28v4" event={"ID":"0b0192d1-09f9-4288-9974-4722499aa70c","Type":"ContainerDied","Data":"d8a5c42fa71fcb839cb32be74a4860d9c055f794d10b51607b3f1c0c43ce20d6"} Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.152859 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f","Type":"ContainerStarted","Data":"cc59b372c53d64394568fcb344a4f75869ec070bbe652276eb38e6428e4e8773"} Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.153101 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f","Type":"ContainerStarted","Data":"b185b11c496dad2eba0c4f9bf82592a7855500e6761262b15766e00cb80154ae"} Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.153118 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f","Type":"ContainerStarted","Data":"7e30576b90112d6a0ee7977c9e30ef367b4945dd8a266e34e0a3b29f2dc102bd"} Dec 05 06:10:49 
crc kubenswrapper[4865]: I1205 06:10:49.598839 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.658651 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0b0192d1-09f9-4288-9974-4722499aa70c-var-log-ovn\") pod \"0b0192d1-09f9-4288-9974-4722499aa70c\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.658717 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0b0192d1-09f9-4288-9974-4722499aa70c-var-run\") pod \"0b0192d1-09f9-4288-9974-4722499aa70c\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.658811 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0b0192d1-09f9-4288-9974-4722499aa70c-var-run-ovn\") pod \"0b0192d1-09f9-4288-9974-4722499aa70c\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.658802 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b0192d1-09f9-4288-9974-4722499aa70c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "0b0192d1-09f9-4288-9974-4722499aa70c" (UID: "0b0192d1-09f9-4288-9974-4722499aa70c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.658878 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0b0192d1-09f9-4288-9974-4722499aa70c-additional-scripts\") pod \"0b0192d1-09f9-4288-9974-4722499aa70c\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.658885 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b0192d1-09f9-4288-9974-4722499aa70c-var-run" (OuterVolumeSpecName: "var-run") pod "0b0192d1-09f9-4288-9974-4722499aa70c" (UID: "0b0192d1-09f9-4288-9974-4722499aa70c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.658927 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b0192d1-09f9-4288-9974-4722499aa70c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "0b0192d1-09f9-4288-9974-4722499aa70c" (UID: "0b0192d1-09f9-4288-9974-4722499aa70c"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.658968 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b0192d1-09f9-4288-9974-4722499aa70c-scripts\") pod \"0b0192d1-09f9-4288-9974-4722499aa70c\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.658998 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpzmt\" (UniqueName: \"kubernetes.io/projected/0b0192d1-09f9-4288-9974-4722499aa70c-kube-api-access-zpzmt\") pod \"0b0192d1-09f9-4288-9974-4722499aa70c\" (UID: \"0b0192d1-09f9-4288-9974-4722499aa70c\") " Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.659330 4865 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0b0192d1-09f9-4288-9974-4722499aa70c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.659342 4865 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0b0192d1-09f9-4288-9974-4722499aa70c-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.659350 4865 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0b0192d1-09f9-4288-9974-4722499aa70c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.659696 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b0192d1-09f9-4288-9974-4722499aa70c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "0b0192d1-09f9-4288-9974-4722499aa70c" (UID: "0b0192d1-09f9-4288-9974-4722499aa70c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.660109 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b0192d1-09f9-4288-9974-4722499aa70c-scripts" (OuterVolumeSpecName: "scripts") pod "0b0192d1-09f9-4288-9974-4722499aa70c" (UID: "0b0192d1-09f9-4288-9974-4722499aa70c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.664976 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b0192d1-09f9-4288-9974-4722499aa70c-kube-api-access-zpzmt" (OuterVolumeSpecName: "kube-api-access-zpzmt") pod "0b0192d1-09f9-4288-9974-4722499aa70c" (UID: "0b0192d1-09f9-4288-9974-4722499aa70c"). InnerVolumeSpecName "kube-api-access-zpzmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.761120 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpzmt\" (UniqueName: \"kubernetes.io/projected/0b0192d1-09f9-4288-9974-4722499aa70c-kube-api-access-zpzmt\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.761158 4865 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0b0192d1-09f9-4288-9974-4722499aa70c-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:49 crc kubenswrapper[4865]: I1205 06:10:49.761168 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b0192d1-09f9-4288-9974-4722499aa70c-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.167544 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f","Type":"ContainerStarted","Data":"60f186b84deb392f3c2f15ae552abb726385412737f7d8deb4db86cc42fc73b0"} Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.170997 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56dth-config-c28v4" event={"ID":"0b0192d1-09f9-4288-9974-4722499aa70c","Type":"ContainerDied","Data":"c4d29a04f07603c1e7e45e0d44f64576a3909260d12fb87bf48ae98060736c23"} Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.171041 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4d29a04f07603c1e7e45e0d44f64576a3909260d12fb87bf48ae98060736c23" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.171055 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-56dth-config-c28v4" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.223075 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-56dth-config-c28v4"] Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.231262 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-56dth-config-c28v4"] Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.448338 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-56dth" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.688529 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-dj2h4"] Dec 05 06:10:50 crc kubenswrapper[4865]: E1205 06:10:50.692613 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b0192d1-09f9-4288-9974-4722499aa70c" containerName="ovn-config" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.692902 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b0192d1-09f9-4288-9974-4722499aa70c" containerName="ovn-config" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.693219 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b0192d1-09f9-4288-9974-4722499aa70c" containerName="ovn-config" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.693913 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-dj2h4" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.696784 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.697068 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7rxpt" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.713288 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-dj2h4"] Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.774818 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5m54\" (UniqueName: \"kubernetes.io/projected/e6d4176e-7d68-48c6-9e9a-c507558508ab-kube-api-access-f5m54\") pod \"glance-db-sync-dj2h4\" (UID: \"e6d4176e-7d68-48c6-9e9a-c507558508ab\") " pod="openstack/glance-db-sync-dj2h4" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.774898 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6d4176e-7d68-48c6-9e9a-c507558508ab-db-sync-config-data\") pod \"glance-db-sync-dj2h4\" (UID: \"e6d4176e-7d68-48c6-9e9a-c507558508ab\") " pod="openstack/glance-db-sync-dj2h4" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.774938 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d4176e-7d68-48c6-9e9a-c507558508ab-config-data\") pod \"glance-db-sync-dj2h4\" (UID: \"e6d4176e-7d68-48c6-9e9a-c507558508ab\") " pod="openstack/glance-db-sync-dj2h4" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.774964 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d4176e-7d68-48c6-9e9a-c507558508ab-combined-ca-bundle\") pod \"glance-db-sync-dj2h4\" (UID: \"e6d4176e-7d68-48c6-9e9a-c507558508ab\") " pod="openstack/glance-db-sync-dj2h4" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.877701 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5m54\" (UniqueName: \"kubernetes.io/projected/e6d4176e-7d68-48c6-9e9a-c507558508ab-kube-api-access-f5m54\") pod \"glance-db-sync-dj2h4\" (UID: \"e6d4176e-7d68-48c6-9e9a-c507558508ab\") " pod="openstack/glance-db-sync-dj2h4" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.878032 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6d4176e-7d68-48c6-9e9a-c507558508ab-db-sync-config-data\") pod \"glance-db-sync-dj2h4\" (UID: \"e6d4176e-7d68-48c6-9e9a-c507558508ab\") " pod="openstack/glance-db-sync-dj2h4" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.878072 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d4176e-7d68-48c6-9e9a-c507558508ab-config-data\") pod \"glance-db-sync-dj2h4\" (UID: \"e6d4176e-7d68-48c6-9e9a-c507558508ab\") " pod="openstack/glance-db-sync-dj2h4" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.878097 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d4176e-7d68-48c6-9e9a-c507558508ab-combined-ca-bundle\") pod 
\"glance-db-sync-dj2h4\" (UID: \"e6d4176e-7d68-48c6-9e9a-c507558508ab\") " pod="openstack/glance-db-sync-dj2h4" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.884661 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d4176e-7d68-48c6-9e9a-c507558508ab-config-data\") pod \"glance-db-sync-dj2h4\" (UID: \"e6d4176e-7d68-48c6-9e9a-c507558508ab\") " pod="openstack/glance-db-sync-dj2h4" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.889339 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6d4176e-7d68-48c6-9e9a-c507558508ab-db-sync-config-data\") pod \"glance-db-sync-dj2h4\" (UID: \"e6d4176e-7d68-48c6-9e9a-c507558508ab\") " pod="openstack/glance-db-sync-dj2h4" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.895294 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d4176e-7d68-48c6-9e9a-c507558508ab-combined-ca-bundle\") pod \"glance-db-sync-dj2h4\" (UID: \"e6d4176e-7d68-48c6-9e9a-c507558508ab\") " pod="openstack/glance-db-sync-dj2h4" Dec 05 06:10:50 crc kubenswrapper[4865]: I1205 06:10:50.895985 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5m54\" (UniqueName: \"kubernetes.io/projected/e6d4176e-7d68-48c6-9e9a-c507558508ab-kube-api-access-f5m54\") pod \"glance-db-sync-dj2h4\" (UID: \"e6d4176e-7d68-48c6-9e9a-c507558508ab\") " pod="openstack/glance-db-sync-dj2h4" Dec 05 06:10:51 crc kubenswrapper[4865]: I1205 06:10:51.018367 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b0192d1-09f9-4288-9974-4722499aa70c" path="/var/lib/kubelet/pods/0b0192d1-09f9-4288-9974-4722499aa70c/volumes" Dec 05 06:10:51 crc kubenswrapper[4865]: I1205 06:10:51.047361 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-dj2h4" Dec 05 06:10:51 crc kubenswrapper[4865]: I1205 06:10:51.182735 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f","Type":"ContainerStarted","Data":"0b17de92ff9ec9d776a1d867d09d85180cd5fc4556551aa3b9fe0699ce533f24"} Dec 05 06:10:51 crc kubenswrapper[4865]: I1205 06:10:51.182776 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f","Type":"ContainerStarted","Data":"7a3569b4ec88b97272ae24008bcf23cb95cc9d485fa81ccdb32fc73d6e062337"} Dec 05 06:10:51 crc kubenswrapper[4865]: I1205 06:10:51.785644 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-dj2h4"] Dec 05 06:10:52 crc kubenswrapper[4865]: I1205 06:10:52.202679 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f","Type":"ContainerStarted","Data":"7cea0520568c8cc0acffbdece97eae82e76a112b09d4ce7e152b2bc053ddee1a"} Dec 05 06:10:52 crc kubenswrapper[4865]: I1205 06:10:52.202734 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f","Type":"ContainerStarted","Data":"360b6b2791c1530db9bf571a06080581c540d9d1bf0c4980bb73aed961b909ba"} Dec 05 06:10:52 crc kubenswrapper[4865]: I1205 06:10:52.204680 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dj2h4" event={"ID":"e6d4176e-7d68-48c6-9e9a-c507558508ab","Type":"ContainerStarted","Data":"ec18096ba96c7cb97bcfc45e32e2f816423877dfe5e6a3cb9d6718981bc3d352"} Dec 05 06:10:53 crc kubenswrapper[4865]: I1205 06:10:53.218343 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f","Type":"ContainerStarted","Data":"5aa528c7f04ebfd590b290d243910a419d3f764a4d81a912337e3736a695da4f"} Dec 05 06:10:53 crc kubenswrapper[4865]: I1205 06:10:53.218594 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f","Type":"ContainerStarted","Data":"7c8ff3e39a4a6de303d08477462d5c6cf07205a5c37dde2249dd325200866dfc"} Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.231199 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f","Type":"ContainerStarted","Data":"e06bcffacd9414199b6a184c256086f3f5f08a47568e5cd1b015a39585be0f0a"} Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.231615 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f","Type":"ContainerStarted","Data":"2dd5a9e32d5506e3565410222c5adb529c9dcb496e522b6209e8a7ced20dd0fc"} Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.231639 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f","Type":"ContainerStarted","Data":"7be4969662f42ae7aff2e9511befbffc26e4fda3ed10c7b2de614453be316c06"} Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.231652 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f","Type":"ContainerStarted","Data":"a475db1723a0617491cdf9a3bc630989f42161f14ca06a757463124bdeefae3e"} Dec 05 
06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.231663 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f","Type":"ContainerStarted","Data":"9c777cf2df925eeae99cf1854855b31324e3de8b2f1ba20a6603d26f120bd940"} Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.275636 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=35.474108539 podStartE2EDuration="41.275605923s" podCreationTimestamp="2025-12-05 06:10:13 +0000 UTC" firstStartedPulling="2025-12-05 06:10:46.915500083 +0000 UTC m=+1066.195511305" lastFinishedPulling="2025-12-05 06:10:52.716997457 +0000 UTC m=+1071.997008689" observedRunningTime="2025-12-05 06:10:54.274341087 +0000 UTC m=+1073.554352309" watchObservedRunningTime="2025-12-05 06:10:54.275605923 +0000 UTC m=+1073.555617145" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.570527 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-qxf4f"] Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.571900 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.575057 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.585449 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-qxf4f"] Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.664100 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-config\") pod \"dnsmasq-dns-6d5b6d6b67-qxf4f\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.664233 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-qxf4f\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.664278 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-qxf4f\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.664375 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdgv6\" (UniqueName: \"kubernetes.io/projected/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-kube-api-access-rdgv6\") pod \"dnsmasq-dns-6d5b6d6b67-qxf4f\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.664445 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-qxf4f\" (UID: 
\"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.664669 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-qxf4f\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.766612 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdgv6\" (UniqueName: \"kubernetes.io/projected/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-kube-api-access-rdgv6\") pod \"dnsmasq-dns-6d5b6d6b67-qxf4f\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.766672 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-qxf4f\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.766916 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-qxf4f\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.766991 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-config\") pod \"dnsmasq-dns-6d5b6d6b67-qxf4f\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.767013 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-qxf4f\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.767032 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-qxf4f\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.768408 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-config\") pod \"dnsmasq-dns-6d5b6d6b67-qxf4f\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.768611 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-qxf4f\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " 
pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.768653 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-qxf4f\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.768988 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-qxf4f\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.769220 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-qxf4f\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.791696 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdgv6\" (UniqueName: \"kubernetes.io/projected/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-kube-api-access-rdgv6\") pod \"dnsmasq-dns-6d5b6d6b67-qxf4f\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:54 crc kubenswrapper[4865]: I1205 06:10:54.891854 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:55 crc kubenswrapper[4865]: I1205 06:10:55.536347 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-qxf4f"] Dec 05 06:10:55 crc kubenswrapper[4865]: W1205 06:10:55.557424 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd09f4d0d_afcd_4e4b_bab2_aa5b2f110947.slice/crio-913ea858072575a473b6bbe2445ace3f84296026ccc6fe2f3526a25192067df8 WatchSource:0}: Error finding container 913ea858072575a473b6bbe2445ace3f84296026ccc6fe2f3526a25192067df8: Status 404 returned error can't find the container with id 913ea858072575a473b6bbe2445ace3f84296026ccc6fe2f3526a25192067df8 Dec 05 06:10:56 crc kubenswrapper[4865]: I1205 06:10:56.265110 4865 generic.go:334] "Generic (PLEG): container finished" podID="d09f4d0d-afcd-4e4b-bab2-aa5b2f110947" containerID="18ed71db1dfd58a39493995109e453a024d482c527319af90c4b0b98c037c58b" exitCode=0 Dec 05 06:10:56 crc kubenswrapper[4865]: I1205 06:10:56.265177 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" event={"ID":"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947","Type":"ContainerDied","Data":"18ed71db1dfd58a39493995109e453a024d482c527319af90c4b0b98c037c58b"} Dec 05 06:10:56 crc kubenswrapper[4865]: I1205 06:10:56.265237 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" event={"ID":"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947","Type":"ContainerStarted","Data":"913ea858072575a473b6bbe2445ace3f84296026ccc6fe2f3526a25192067df8"} Dec 05 06:10:57 crc kubenswrapper[4865]: I1205 06:10:57.280269 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" 
event={"ID":"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947","Type":"ContainerStarted","Data":"4a656a530bc6484b2337f17bc7e1704ff915e09f95702ccac5810519d132dc91"} Dec 05 06:10:57 crc kubenswrapper[4865]: I1205 06:10:57.281558 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:10:57 crc kubenswrapper[4865]: I1205 06:10:57.298865 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" podStartSLOduration=3.298846676 podStartE2EDuration="3.298846676s" podCreationTimestamp="2025-12-05 06:10:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:10:57.295004307 +0000 UTC m=+1076.575015529" watchObservedRunningTime="2025-12-05 06:10:57.298846676 +0000 UTC m=+1076.578857898" Dec 05 06:11:00 crc kubenswrapper[4865]: I1205 06:11:00.316289 4865 generic.go:334] "Generic (PLEG): container finished" podID="b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" containerID="49d67739f31cafcc87fc6c330cb79fbdef086b770232befb583085154eba0839" exitCode=0 Dec 05 06:11:00 crc kubenswrapper[4865]: I1205 06:11:00.316402 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3","Type":"ContainerDied","Data":"49d67739f31cafcc87fc6c330cb79fbdef086b770232befb583085154eba0839"} Dec 05 06:11:00 crc kubenswrapper[4865]: I1205 06:11:00.320101 4865 generic.go:334] "Generic (PLEG): container finished" podID="ff4eacf2-62b6-48a0-9650-77e19a6db904" containerID="c7f574b73f3827d92d357368f5ca3d0ed0908143aa99fad5d4fc36c55180e6ef" exitCode=0 Dec 05 06:11:00 crc kubenswrapper[4865]: I1205 06:11:00.320157 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ff4eacf2-62b6-48a0-9650-77e19a6db904","Type":"ContainerDied","Data":"c7f574b73f3827d92d357368f5ca3d0ed0908143aa99fad5d4fc36c55180e6ef"} Dec 05 06:11:04 crc kubenswrapper[4865]: I1205 06:11:04.894097 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:11:04 crc kubenswrapper[4865]: I1205 06:11:04.952244 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-hrq8n"] Dec 05 06:11:04 crc kubenswrapper[4865]: I1205 06:11:04.952539 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" podUID="2c8cda95-9251-4c67-90f3-0cc090868b6f" containerName="dnsmasq-dns" containerID="cri-o://c7ce465cce771b87eafdce55819bd0e77d29d2739c5278e527c3bc6ffb43fd2c" gracePeriod=10 Dec 05 06:11:05 crc kubenswrapper[4865]: I1205 06:11:05.111745 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" podUID="2c8cda95-9251-4c67-90f3-0cc090868b6f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused" Dec 05 06:11:06 crc kubenswrapper[4865]: I1205 06:11:06.389117 4865 generic.go:334] "Generic (PLEG): container finished" podID="2c8cda95-9251-4c67-90f3-0cc090868b6f" containerID="c7ce465cce771b87eafdce55819bd0e77d29d2739c5278e527c3bc6ffb43fd2c" exitCode=0 Dec 05 06:11:06 crc kubenswrapper[4865]: I1205 06:11:06.389162 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" 
event={"ID":"2c8cda95-9251-4c67-90f3-0cc090868b6f","Type":"ContainerDied","Data":"c7ce465cce771b87eafdce55819bd0e77d29d2739c5278e527c3bc6ffb43fd2c"} Dec 05 06:11:07 crc kubenswrapper[4865]: I1205 06:11:07.474379 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:11:07 crc kubenswrapper[4865]: I1205 06:11:07.648699 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-ovsdbserver-sb\") pod \"2c8cda95-9251-4c67-90f3-0cc090868b6f\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " Dec 05 06:11:07 crc kubenswrapper[4865]: I1205 06:11:07.648784 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccmzm\" (UniqueName: \"kubernetes.io/projected/2c8cda95-9251-4c67-90f3-0cc090868b6f-kube-api-access-ccmzm\") pod \"2c8cda95-9251-4c67-90f3-0cc090868b6f\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " Dec 05 06:11:07 crc kubenswrapper[4865]: I1205 06:11:07.648850 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-config\") pod \"2c8cda95-9251-4c67-90f3-0cc090868b6f\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " Dec 05 06:11:07 crc kubenswrapper[4865]: I1205 06:11:07.648881 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-dns-svc\") pod \"2c8cda95-9251-4c67-90f3-0cc090868b6f\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " Dec 05 06:11:07 crc kubenswrapper[4865]: I1205 06:11:07.648968 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-ovsdbserver-nb\") pod \"2c8cda95-9251-4c67-90f3-0cc090868b6f\" (UID: \"2c8cda95-9251-4c67-90f3-0cc090868b6f\") " Dec 05 06:11:07 crc kubenswrapper[4865]: I1205 06:11:07.653497 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c8cda95-9251-4c67-90f3-0cc090868b6f-kube-api-access-ccmzm" (OuterVolumeSpecName: "kube-api-access-ccmzm") pod "2c8cda95-9251-4c67-90f3-0cc090868b6f" (UID: "2c8cda95-9251-4c67-90f3-0cc090868b6f"). InnerVolumeSpecName "kube-api-access-ccmzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:11:07 crc kubenswrapper[4865]: I1205 06:11:07.701602 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-config" (OuterVolumeSpecName: "config") pod "2c8cda95-9251-4c67-90f3-0cc090868b6f" (UID: "2c8cda95-9251-4c67-90f3-0cc090868b6f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:07 crc kubenswrapper[4865]: I1205 06:11:07.702985 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2c8cda95-9251-4c67-90f3-0cc090868b6f" (UID: "2c8cda95-9251-4c67-90f3-0cc090868b6f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:07 crc kubenswrapper[4865]: I1205 06:11:07.716020 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2c8cda95-9251-4c67-90f3-0cc090868b6f" (UID: "2c8cda95-9251-4c67-90f3-0cc090868b6f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:07 crc kubenswrapper[4865]: I1205 06:11:07.738355 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2c8cda95-9251-4c67-90f3-0cc090868b6f" (UID: "2c8cda95-9251-4c67-90f3-0cc090868b6f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:07 crc kubenswrapper[4865]: I1205 06:11:07.751239 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:07 crc kubenswrapper[4865]: I1205 06:11:07.751259 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccmzm\" (UniqueName: \"kubernetes.io/projected/2c8cda95-9251-4c67-90f3-0cc090868b6f-kube-api-access-ccmzm\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:07 crc kubenswrapper[4865]: I1205 06:11:07.751272 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:07 crc kubenswrapper[4865]: I1205 06:11:07.751281 4865 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:07 crc kubenswrapper[4865]: I1205 06:11:07.751289 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c8cda95-9251-4c67-90f3-0cc090868b6f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:08 crc kubenswrapper[4865]: I1205 06:11:08.444043 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" event={"ID":"2c8cda95-9251-4c67-90f3-0cc090868b6f","Type":"ContainerDied","Data":"7c5bca76f821305550cf43ce45bd89989a8e6314a4f290a694e788b8e60ad1c2"} Dec 05 06:11:08 crc kubenswrapper[4865]: I1205 06:11:08.444498 4865 scope.go:117] "RemoveContainer" containerID="c7ce465cce771b87eafdce55819bd0e77d29d2739c5278e527c3bc6ffb43fd2c" Dec 05 06:11:08 crc kubenswrapper[4865]: I1205 06:11:08.444339 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-hrq8n" Dec 05 06:11:08 crc kubenswrapper[4865]: I1205 06:11:08.450785 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dj2h4" event={"ID":"e6d4176e-7d68-48c6-9e9a-c507558508ab","Type":"ContainerStarted","Data":"dcd36b620cd531d364f2b49652887b92dafe53598709ef92028c21c00e70f2c4"} Dec 05 06:11:08 crc kubenswrapper[4865]: I1205 06:11:08.465889 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3","Type":"ContainerStarted","Data":"e4dc6db7c977e358c17d555fa585da5768e5c4183459af9eccb20e6a88adfaf9"} Dec 05 06:11:08 crc kubenswrapper[4865]: I1205 06:11:08.466177 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 05 06:11:08 crc kubenswrapper[4865]: I1205 06:11:08.488185 4865 scope.go:117] "RemoveContainer" containerID="153fd74b9be2b5b1e3788e708193984c3182264607700e1f502862150b4126fa" Dec 05 06:11:08 crc kubenswrapper[4865]: I1205 06:11:08.489593 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ff4eacf2-62b6-48a0-9650-77e19a6db904","Type":"ContainerStarted","Data":"9ef3a7665eacae1fb28a763e4075c724f9dba94f76e6e8b58d9e619e65ee57b5"} Dec 05 06:11:08 crc kubenswrapper[4865]: I1205 06:11:08.490076 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:11:08 crc kubenswrapper[4865]: I1205 06:11:08.502721 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-dj2h4" podStartSLOduration=2.950528505 podStartE2EDuration="18.502699413s" podCreationTimestamp="2025-12-05 06:10:50 +0000 UTC" firstStartedPulling="2025-12-05 06:10:51.801731463 +0000 UTC m=+1071.081742685" lastFinishedPulling="2025-12-05 06:11:07.353902371 +0000 UTC m=+1086.633913593" observedRunningTime="2025-12-05 06:11:08.480135454 +0000 UTC m=+1087.760146686" watchObservedRunningTime="2025-12-05 06:11:08.502699413 +0000 UTC m=+1087.782710635" Dec 05 06:11:08 crc kubenswrapper[4865]: I1205 06:11:08.521287 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=46.466362236 podStartE2EDuration="1m33.521262707s" podCreationTimestamp="2025-12-05 06:09:35 +0000 UTC" firstStartedPulling="2025-12-05 06:09:37.597300594 +0000 UTC m=+996.877311816" lastFinishedPulling="2025-12-05 06:10:24.652201065 +0000 UTC m=+1043.932212287" observedRunningTime="2025-12-05 06:11:08.512875871 +0000 UTC m=+1087.792887093" watchObservedRunningTime="2025-12-05 06:11:08.521262707 +0000 UTC m=+1087.801273929" Dec 05 06:11:08 crc kubenswrapper[4865]: I1205 06:11:08.548074 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-hrq8n"] Dec 05 06:11:08 crc kubenswrapper[4865]: I1205 06:11:08.557229 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-hrq8n"] Dec 05 06:11:08 crc kubenswrapper[4865]: I1205 06:11:08.571800 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=46.727912578 podStartE2EDuration="1m33.571776966s" podCreationTimestamp="2025-12-05 06:09:35 +0000 UTC" firstStartedPulling="2025-12-05 06:09:37.811901797 +0000 UTC m=+997.091913019" lastFinishedPulling="2025-12-05 06:10:24.655766175 +0000 UTC m=+1043.935777407" 
observedRunningTime="2025-12-05 06:11:08.569336977 +0000 UTC m=+1087.849348219" watchObservedRunningTime="2025-12-05 06:11:08.571776966 +0000 UTC m=+1087.851788198" Dec 05 06:11:09 crc kubenswrapper[4865]: I1205 06:11:09.016315 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c8cda95-9251-4c67-90f3-0cc090868b6f" path="/var/lib/kubelet/pods/2c8cda95-9251-4c67-90f3-0cc090868b6f/volumes" Dec 05 06:11:11 crc kubenswrapper[4865]: I1205 06:11:11.048545 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:11:11 crc kubenswrapper[4865]: I1205 06:11:11.048960 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:11:14 crc kubenswrapper[4865]: I1205 06:11:14.543124 4865 generic.go:334] "Generic (PLEG): container finished" podID="e6d4176e-7d68-48c6-9e9a-c507558508ab" containerID="dcd36b620cd531d364f2b49652887b92dafe53598709ef92028c21c00e70f2c4" exitCode=0 Dec 05 06:11:14 crc kubenswrapper[4865]: I1205 06:11:14.543233 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dj2h4" event={"ID":"e6d4176e-7d68-48c6-9e9a-c507558508ab","Type":"ContainerDied","Data":"dcd36b620cd531d364f2b49652887b92dafe53598709ef92028c21c00e70f2c4"} Dec 05 06:11:15 crc kubenswrapper[4865]: I1205 06:11:15.982104 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-dj2h4" Dec 05 06:11:16 crc kubenswrapper[4865]: I1205 06:11:16.129460 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5m54\" (UniqueName: \"kubernetes.io/projected/e6d4176e-7d68-48c6-9e9a-c507558508ab-kube-api-access-f5m54\") pod \"e6d4176e-7d68-48c6-9e9a-c507558508ab\" (UID: \"e6d4176e-7d68-48c6-9e9a-c507558508ab\") " Dec 05 06:11:16 crc kubenswrapper[4865]: I1205 06:11:16.129502 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d4176e-7d68-48c6-9e9a-c507558508ab-config-data\") pod \"e6d4176e-7d68-48c6-9e9a-c507558508ab\" (UID: \"e6d4176e-7d68-48c6-9e9a-c507558508ab\") " Dec 05 06:11:16 crc kubenswrapper[4865]: I1205 06:11:16.129589 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d4176e-7d68-48c6-9e9a-c507558508ab-combined-ca-bundle\") pod \"e6d4176e-7d68-48c6-9e9a-c507558508ab\" (UID: \"e6d4176e-7d68-48c6-9e9a-c507558508ab\") " Dec 05 06:11:16 crc kubenswrapper[4865]: I1205 06:11:16.129623 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6d4176e-7d68-48c6-9e9a-c507558508ab-db-sync-config-data\") pod \"e6d4176e-7d68-48c6-9e9a-c507558508ab\" (UID: \"e6d4176e-7d68-48c6-9e9a-c507558508ab\") " Dec 05 06:11:16 crc kubenswrapper[4865]: I1205 06:11:16.136773 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d4176e-7d68-48c6-9e9a-c507558508ab-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e6d4176e-7d68-48c6-9e9a-c507558508ab" (UID: "e6d4176e-7d68-48c6-9e9a-c507558508ab"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:16 crc kubenswrapper[4865]: I1205 06:11:16.138638 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d4176e-7d68-48c6-9e9a-c507558508ab-kube-api-access-f5m54" (OuterVolumeSpecName: "kube-api-access-f5m54") pod "e6d4176e-7d68-48c6-9e9a-c507558508ab" (UID: "e6d4176e-7d68-48c6-9e9a-c507558508ab"). InnerVolumeSpecName "kube-api-access-f5m54". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:11:16 crc kubenswrapper[4865]: I1205 06:11:16.166063 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d4176e-7d68-48c6-9e9a-c507558508ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6d4176e-7d68-48c6-9e9a-c507558508ab" (UID: "e6d4176e-7d68-48c6-9e9a-c507558508ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:16 crc kubenswrapper[4865]: I1205 06:11:16.193339 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d4176e-7d68-48c6-9e9a-c507558508ab-config-data" (OuterVolumeSpecName: "config-data") pod "e6d4176e-7d68-48c6-9e9a-c507558508ab" (UID: "e6d4176e-7d68-48c6-9e9a-c507558508ab"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:16 crc kubenswrapper[4865]: I1205 06:11:16.231911 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5m54\" (UniqueName: \"kubernetes.io/projected/e6d4176e-7d68-48c6-9e9a-c507558508ab-kube-api-access-f5m54\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:16 crc kubenswrapper[4865]: I1205 06:11:16.231948 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d4176e-7d68-48c6-9e9a-c507558508ab-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:16 crc kubenswrapper[4865]: I1205 06:11:16.232144 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d4176e-7d68-48c6-9e9a-c507558508ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:16 crc kubenswrapper[4865]: I1205 06:11:16.232153 4865 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6d4176e-7d68-48c6-9e9a-c507558508ab-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:16 crc kubenswrapper[4865]: I1205 06:11:16.560489 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dj2h4" event={"ID":"e6d4176e-7d68-48c6-9e9a-c507558508ab","Type":"ContainerDied","Data":"ec18096ba96c7cb97bcfc45e32e2f816423877dfe5e6a3cb9d6718981bc3d352"} Dec 05 06:11:16 crc kubenswrapper[4865]: I1205 06:11:16.560866 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec18096ba96c7cb97bcfc45e32e2f816423877dfe5e6a3cb9d6718981bc3d352" Dec 05 06:11:16 crc kubenswrapper[4865]: I1205 06:11:16.560936 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-dj2h4" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.019979 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-kf7z9"] Dec 05 06:11:17 crc kubenswrapper[4865]: E1205 06:11:17.020260 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d4176e-7d68-48c6-9e9a-c507558508ab" containerName="glance-db-sync" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.020273 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d4176e-7d68-48c6-9e9a-c507558508ab" containerName="glance-db-sync" Dec 05 06:11:17 crc kubenswrapper[4865]: E1205 06:11:17.020306 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8cda95-9251-4c67-90f3-0cc090868b6f" containerName="dnsmasq-dns" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.020312 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8cda95-9251-4c67-90f3-0cc090868b6f" containerName="dnsmasq-dns" Dec 05 06:11:17 crc kubenswrapper[4865]: E1205 06:11:17.020330 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8cda95-9251-4c67-90f3-0cc090868b6f" containerName="init" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.020335 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8cda95-9251-4c67-90f3-0cc090868b6f" containerName="init" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.020499 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8cda95-9251-4c67-90f3-0cc090868b6f" containerName="dnsmasq-dns" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.020518 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d4176e-7d68-48c6-9e9a-c507558508ab" containerName="glance-db-sync" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.023024 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.045728 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-kf7z9"] Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.149961 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-kf7z9\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.150033 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-kf7z9\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.150053 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-kf7z9\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.150188 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwmsg\" (UniqueName: \"kubernetes.io/projected/6600be64-7e1e-4d51-a8df-8cb630b9af81-kube-api-access-gwmsg\") pod \"dnsmasq-dns-895cf5cf-kf7z9\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.150288 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-dns-svc\") pod \"dnsmasq-dns-895cf5cf-kf7z9\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.150317 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-config\") pod \"dnsmasq-dns-895cf5cf-kf7z9\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.171997 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.251907 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwmsg\" (UniqueName: \"kubernetes.io/projected/6600be64-7e1e-4d51-a8df-8cb630b9af81-kube-api-access-gwmsg\") pod \"dnsmasq-dns-895cf5cf-kf7z9\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.252024 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-dns-svc\") pod \"dnsmasq-dns-895cf5cf-kf7z9\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " 
pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.252056 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-config\") pod \"dnsmasq-dns-895cf5cf-kf7z9\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.252105 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-kf7z9\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.252135 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-kf7z9\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.252158 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-kf7z9\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.253592 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-kf7z9\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.257228 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-dns-svc\") pod \"dnsmasq-dns-895cf5cf-kf7z9\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.257891 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-config\") pod \"dnsmasq-dns-895cf5cf-kf7z9\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.259073 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-kf7z9\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.259631 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-kf7z9\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.302277 4865 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gwmsg\" (UniqueName: \"kubernetes.io/projected/6600be64-7e1e-4d51-a8df-8cb630b9af81-kube-api-access-gwmsg\") pod \"dnsmasq-dns-895cf5cf-kf7z9\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.341398 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:17 crc kubenswrapper[4865]: I1205 06:11:17.707643 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-kf7z9"] Dec 05 06:11:17 crc kubenswrapper[4865]: W1205 06:11:17.714677 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6600be64_7e1e_4d51_a8df_8cb630b9af81.slice/crio-5d2d561efbc73a3857b22e21170d03c40d30f3908ae7e92df277b833f574c07a WatchSource:0}: Error finding container 5d2d561efbc73a3857b22e21170d03c40d30f3908ae7e92df277b833f574c07a: Status 404 returned error can't find the container with id 5d2d561efbc73a3857b22e21170d03c40d30f3908ae7e92df277b833f574c07a Dec 05 06:11:18 crc kubenswrapper[4865]: I1205 06:11:18.579541 4865 generic.go:334] "Generic (PLEG): container finished" podID="6600be64-7e1e-4d51-a8df-8cb630b9af81" containerID="2e3329f5bbc3cad0322a9067aade8f131d1e2190f2e20c074b9694cde384d849" exitCode=0 Dec 05 06:11:18 crc kubenswrapper[4865]: I1205 06:11:18.579587 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" event={"ID":"6600be64-7e1e-4d51-a8df-8cb630b9af81","Type":"ContainerDied","Data":"2e3329f5bbc3cad0322a9067aade8f131d1e2190f2e20c074b9694cde384d849"} Dec 05 06:11:18 crc kubenswrapper[4865]: I1205 06:11:18.579875 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" event={"ID":"6600be64-7e1e-4d51-a8df-8cb630b9af81","Type":"ContainerStarted","Data":"5d2d561efbc73a3857b22e21170d03c40d30f3908ae7e92df277b833f574c07a"} Dec 05 06:11:19 crc kubenswrapper[4865]: I1205 06:11:19.599641 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" event={"ID":"6600be64-7e1e-4d51-a8df-8cb630b9af81","Type":"ContainerStarted","Data":"72ef5c16fd4daa3dfa79816d71e20a8833a8242effb872e1a5714a6b250c442c"} Dec 05 06:11:19 crc kubenswrapper[4865]: I1205 06:11:19.601410 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:19 crc kubenswrapper[4865]: I1205 06:11:19.650605 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" podStartSLOduration=3.6505734949999997 podStartE2EDuration="3.650573495s" podCreationTimestamp="2025-12-05 06:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:11:19.643931648 +0000 UTC m=+1098.923942870" watchObservedRunningTime="2025-12-05 06:11:19.650573495 +0000 UTC m=+1098.930584717" Dec 05 06:11:26 crc kubenswrapper[4865]: I1205 06:11:26.778027 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.210989 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-ntj72"] Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.211974 4865 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/barbican-db-create-ntj72" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.228957 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ntj72"] Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.248420 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzt8l\" (UniqueName: \"kubernetes.io/projected/75d26b7c-932e-4c37-95d3-9f5dc0b874b5-kube-api-access-rzt8l\") pod \"barbican-db-create-ntj72\" (UID: \"75d26b7c-932e-4c37-95d3-9f5dc0b874b5\") " pod="openstack/barbican-db-create-ntj72" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.248476 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75d26b7c-932e-4c37-95d3-9f5dc0b874b5-operator-scripts\") pod \"barbican-db-create-ntj72\" (UID: \"75d26b7c-932e-4c37-95d3-9f5dc0b874b5\") " pod="openstack/barbican-db-create-ntj72" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.307258 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-bbd2l"] Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.314046 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bbd2l" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.318431 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bbd2l"] Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.326644 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2426-account-create-update-jpx49"] Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.329027 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2426-account-create-update-jpx49" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.332652 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.345734 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2426-account-create-update-jpx49"] Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.346888 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.349831 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75d26b7c-932e-4c37-95d3-9f5dc0b874b5-operator-scripts\") pod \"barbican-db-create-ntj72\" (UID: \"75d26b7c-932e-4c37-95d3-9f5dc0b874b5\") " pod="openstack/barbican-db-create-ntj72" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.349875 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkfvn\" (UniqueName: \"kubernetes.io/projected/fde195ed-2c6b-465f-b496-97c7d604f1c6-kube-api-access-pkfvn\") pod \"cinder-db-create-bbd2l\" (UID: \"fde195ed-2c6b-465f-b496-97c7d604f1c6\") " pod="openstack/cinder-db-create-bbd2l" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.349919 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49ad1f76-ae1d-4f90-9fcd-819dc43fc598-operator-scripts\") pod \"cinder-2426-account-create-update-jpx49\" (UID: \"49ad1f76-ae1d-4f90-9fcd-819dc43fc598\") " pod="openstack/cinder-2426-account-create-update-jpx49" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.349951 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2c77\" (UniqueName: \"kubernetes.io/projected/49ad1f76-ae1d-4f90-9fcd-819dc43fc598-kube-api-access-b2c77\") pod \"cinder-2426-account-create-update-jpx49\" (UID: \"49ad1f76-ae1d-4f90-9fcd-819dc43fc598\") " pod="openstack/cinder-2426-account-create-update-jpx49" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.350101 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fde195ed-2c6b-465f-b496-97c7d604f1c6-operator-scripts\") pod \"cinder-db-create-bbd2l\" (UID: \"fde195ed-2c6b-465f-b496-97c7d604f1c6\") " pod="openstack/cinder-db-create-bbd2l" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.350123 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzt8l\" (UniqueName: \"kubernetes.io/projected/75d26b7c-932e-4c37-95d3-9f5dc0b874b5-kube-api-access-rzt8l\") pod \"barbican-db-create-ntj72\" (UID: \"75d26b7c-932e-4c37-95d3-9f5dc0b874b5\") " pod="openstack/barbican-db-create-ntj72" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.351852 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75d26b7c-932e-4c37-95d3-9f5dc0b874b5-operator-scripts\") pod \"barbican-db-create-ntj72\" (UID: \"75d26b7c-932e-4c37-95d3-9f5dc0b874b5\") " pod="openstack/barbican-db-create-ntj72" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.385031 4865 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rzt8l\" (UniqueName: \"kubernetes.io/projected/75d26b7c-932e-4c37-95d3-9f5dc0b874b5-kube-api-access-rzt8l\") pod \"barbican-db-create-ntj72\" (UID: \"75d26b7c-932e-4c37-95d3-9f5dc0b874b5\") " pod="openstack/barbican-db-create-ntj72" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.423928 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-16c7-account-create-update-w2r86"] Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.425132 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-16c7-account-create-update-w2r86" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.428070 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.452723 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkfvn\" (UniqueName: \"kubernetes.io/projected/fde195ed-2c6b-465f-b496-97c7d604f1c6-kube-api-access-pkfvn\") pod \"cinder-db-create-bbd2l\" (UID: \"fde195ed-2c6b-465f-b496-97c7d604f1c6\") " pod="openstack/cinder-db-create-bbd2l" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.452778 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49ad1f76-ae1d-4f90-9fcd-819dc43fc598-operator-scripts\") pod \"cinder-2426-account-create-update-jpx49\" (UID: \"49ad1f76-ae1d-4f90-9fcd-819dc43fc598\") " pod="openstack/cinder-2426-account-create-update-jpx49" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.452803 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2c77\" (UniqueName: \"kubernetes.io/projected/49ad1f76-ae1d-4f90-9fcd-819dc43fc598-kube-api-access-b2c77\") pod \"cinder-2426-account-create-update-jpx49\" (UID: \"49ad1f76-ae1d-4f90-9fcd-819dc43fc598\") " pod="openstack/cinder-2426-account-create-update-jpx49" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.452862 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66a615ef-1dfa-45c7-8d5d-bb694d7d13ad-operator-scripts\") pod \"barbican-16c7-account-create-update-w2r86\" (UID: \"66a615ef-1dfa-45c7-8d5d-bb694d7d13ad\") " pod="openstack/barbican-16c7-account-create-update-w2r86" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.452936 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftnwj\" (UniqueName: \"kubernetes.io/projected/66a615ef-1dfa-45c7-8d5d-bb694d7d13ad-kube-api-access-ftnwj\") pod \"barbican-16c7-account-create-update-w2r86\" (UID: \"66a615ef-1dfa-45c7-8d5d-bb694d7d13ad\") " pod="openstack/barbican-16c7-account-create-update-w2r86" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.453009 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fde195ed-2c6b-465f-b496-97c7d604f1c6-operator-scripts\") pod \"cinder-db-create-bbd2l\" (UID: \"fde195ed-2c6b-465f-b496-97c7d604f1c6\") " pod="openstack/cinder-db-create-bbd2l" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.453664 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/fde195ed-2c6b-465f-b496-97c7d604f1c6-operator-scripts\") pod \"cinder-db-create-bbd2l\" (UID: \"fde195ed-2c6b-465f-b496-97c7d604f1c6\") " pod="openstack/cinder-db-create-bbd2l" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.454662 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49ad1f76-ae1d-4f90-9fcd-819dc43fc598-operator-scripts\") pod \"cinder-2426-account-create-update-jpx49\" (UID: \"49ad1f76-ae1d-4f90-9fcd-819dc43fc598\") " pod="openstack/cinder-2426-account-create-update-jpx49" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.477186 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2c77\" (UniqueName: \"kubernetes.io/projected/49ad1f76-ae1d-4f90-9fcd-819dc43fc598-kube-api-access-b2c77\") pod \"cinder-2426-account-create-update-jpx49\" (UID: \"49ad1f76-ae1d-4f90-9fcd-819dc43fc598\") " pod="openstack/cinder-2426-account-create-update-jpx49" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.487345 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkfvn\" (UniqueName: \"kubernetes.io/projected/fde195ed-2c6b-465f-b496-97c7d604f1c6-kube-api-access-pkfvn\") pod \"cinder-db-create-bbd2l\" (UID: \"fde195ed-2c6b-465f-b496-97c7d604f1c6\") " pod="openstack/cinder-db-create-bbd2l" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.489814 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-qxf4f"] Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.490076 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" podUID="d09f4d0d-afcd-4e4b-bab2-aa5b2f110947" containerName="dnsmasq-dns" containerID="cri-o://4a656a530bc6484b2337f17bc7e1704ff915e09f95702ccac5810519d132dc91" gracePeriod=10 Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.531641 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-ntj72" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.553629 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-16c7-account-create-update-w2r86"] Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.554527 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66a615ef-1dfa-45c7-8d5d-bb694d7d13ad-operator-scripts\") pod \"barbican-16c7-account-create-update-w2r86\" (UID: \"66a615ef-1dfa-45c7-8d5d-bb694d7d13ad\") " pod="openstack/barbican-16c7-account-create-update-w2r86" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.554591 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftnwj\" (UniqueName: \"kubernetes.io/projected/66a615ef-1dfa-45c7-8d5d-bb694d7d13ad-kube-api-access-ftnwj\") pod \"barbican-16c7-account-create-update-w2r86\" (UID: \"66a615ef-1dfa-45c7-8d5d-bb694d7d13ad\") " pod="openstack/barbican-16c7-account-create-update-w2r86" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.555772 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66a615ef-1dfa-45c7-8d5d-bb694d7d13ad-operator-scripts\") pod \"barbican-16c7-account-create-update-w2r86\" (UID: \"66a615ef-1dfa-45c7-8d5d-bb694d7d13ad\") " pod="openstack/barbican-16c7-account-create-update-w2r86" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.597095 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-mh5z9"] Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.598116 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mh5z9" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.603467 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.603665 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.603950 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nq2gr" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.604062 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.615350 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftnwj\" (UniqueName: \"kubernetes.io/projected/66a615ef-1dfa-45c7-8d5d-bb694d7d13ad-kube-api-access-ftnwj\") pod \"barbican-16c7-account-create-update-w2r86\" (UID: \"66a615ef-1dfa-45c7-8d5d-bb694d7d13ad\") " pod="openstack/barbican-16c7-account-create-update-w2r86" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.634481 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-bbd2l" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.634930 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mh5z9"] Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.658364 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k5d5\" (UniqueName: \"kubernetes.io/projected/2b061e42-22fc-4b4b-8ebe-20496b1a9f17-kube-api-access-9k5d5\") pod \"keystone-db-sync-mh5z9\" (UID: \"2b061e42-22fc-4b4b-8ebe-20496b1a9f17\") " pod="openstack/keystone-db-sync-mh5z9" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.658846 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b061e42-22fc-4b4b-8ebe-20496b1a9f17-config-data\") pod \"keystone-db-sync-mh5z9\" (UID: \"2b061e42-22fc-4b4b-8ebe-20496b1a9f17\") " pod="openstack/keystone-db-sync-mh5z9" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.658990 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b061e42-22fc-4b4b-8ebe-20496b1a9f17-combined-ca-bundle\") pod \"keystone-db-sync-mh5z9\" (UID: \"2b061e42-22fc-4b4b-8ebe-20496b1a9f17\") " pod="openstack/keystone-db-sync-mh5z9" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.692733 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-mf52x"] Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.707679 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2426-account-create-update-jpx49" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.752524 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-16c7-account-create-update-w2r86" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.779307 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k5d5\" (UniqueName: \"kubernetes.io/projected/2b061e42-22fc-4b4b-8ebe-20496b1a9f17-kube-api-access-9k5d5\") pod \"keystone-db-sync-mh5z9\" (UID: \"2b061e42-22fc-4b4b-8ebe-20496b1a9f17\") " pod="openstack/keystone-db-sync-mh5z9" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.779368 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b061e42-22fc-4b4b-8ebe-20496b1a9f17-config-data\") pod \"keystone-db-sync-mh5z9\" (UID: \"2b061e42-22fc-4b4b-8ebe-20496b1a9f17\") " pod="openstack/keystone-db-sync-mh5z9" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.779390 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b061e42-22fc-4b4b-8ebe-20496b1a9f17-combined-ca-bundle\") pod \"keystone-db-sync-mh5z9\" (UID: \"2b061e42-22fc-4b4b-8ebe-20496b1a9f17\") " pod="openstack/keystone-db-sync-mh5z9" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.786745 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b061e42-22fc-4b4b-8ebe-20496b1a9f17-config-data\") pod \"keystone-db-sync-mh5z9\" (UID: \"2b061e42-22fc-4b4b-8ebe-20496b1a9f17\") " pod="openstack/keystone-db-sync-mh5z9" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.790421 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mf52x"] Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.790670 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mf52x" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.814973 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b061e42-22fc-4b4b-8ebe-20496b1a9f17-combined-ca-bundle\") pod \"keystone-db-sync-mh5z9\" (UID: \"2b061e42-22fc-4b4b-8ebe-20496b1a9f17\") " pod="openstack/keystone-db-sync-mh5z9" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.822245 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k5d5\" (UniqueName: \"kubernetes.io/projected/2b061e42-22fc-4b4b-8ebe-20496b1a9f17-kube-api-access-9k5d5\") pod \"keystone-db-sync-mh5z9\" (UID: \"2b061e42-22fc-4b4b-8ebe-20496b1a9f17\") " pod="openstack/keystone-db-sync-mh5z9" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.850084 4865 generic.go:334] "Generic (PLEG): container finished" podID="d09f4d0d-afcd-4e4b-bab2-aa5b2f110947" containerID="4a656a530bc6484b2337f17bc7e1704ff915e09f95702ccac5810519d132dc91" exitCode=0 Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.850135 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" event={"ID":"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947","Type":"ContainerDied","Data":"4a656a530bc6484b2337f17bc7e1704ff915e09f95702ccac5810519d132dc91"} Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.856116 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6eb1-account-create-update-czskk"] Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.857815 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6eb1-account-create-update-czskk" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.861133 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.900502 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6eb1-account-create-update-czskk"] Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.983361 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30202162-03c8-4d05-b31d-fbb7900ae067-operator-scripts\") pod \"neutron-6eb1-account-create-update-czskk\" (UID: \"30202162-03c8-4d05-b31d-fbb7900ae067\") " pod="openstack/neutron-6eb1-account-create-update-czskk" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.983657 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkq7k\" (UniqueName: \"kubernetes.io/projected/11931045-4e2b-4720-9f3b-745b2215e3ac-kube-api-access-rkq7k\") pod \"neutron-db-create-mf52x\" (UID: \"11931045-4e2b-4720-9f3b-745b2215e3ac\") " pod="openstack/neutron-db-create-mf52x" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.983786 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqjch\" (UniqueName: \"kubernetes.io/projected/30202162-03c8-4d05-b31d-fbb7900ae067-kube-api-access-cqjch\") pod \"neutron-6eb1-account-create-update-czskk\" (UID: \"30202162-03c8-4d05-b31d-fbb7900ae067\") " pod="openstack/neutron-6eb1-account-create-update-czskk" Dec 05 06:11:27 crc kubenswrapper[4865]: I1205 06:11:27.983815 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11931045-4e2b-4720-9f3b-745b2215e3ac-operator-scripts\") pod \"neutron-db-create-mf52x\" (UID: \"11931045-4e2b-4720-9f3b-745b2215e3ac\") " pod="openstack/neutron-db-create-mf52x" Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.085892 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkq7k\" (UniqueName: \"kubernetes.io/projected/11931045-4e2b-4720-9f3b-745b2215e3ac-kube-api-access-rkq7k\") pod \"neutron-db-create-mf52x\" (UID: \"11931045-4e2b-4720-9f3b-745b2215e3ac\") " pod="openstack/neutron-db-create-mf52x" Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.085991 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqjch\" (UniqueName: \"kubernetes.io/projected/30202162-03c8-4d05-b31d-fbb7900ae067-kube-api-access-cqjch\") pod \"neutron-6eb1-account-create-update-czskk\" (UID: \"30202162-03c8-4d05-b31d-fbb7900ae067\") " pod="openstack/neutron-6eb1-account-create-update-czskk" Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.086027 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11931045-4e2b-4720-9f3b-745b2215e3ac-operator-scripts\") pod \"neutron-db-create-mf52x\" (UID: \"11931045-4e2b-4720-9f3b-745b2215e3ac\") " pod="openstack/neutron-db-create-mf52x" Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.086071 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/30202162-03c8-4d05-b31d-fbb7900ae067-operator-scripts\") pod \"neutron-6eb1-account-create-update-czskk\" (UID: \"30202162-03c8-4d05-b31d-fbb7900ae067\") " pod="openstack/neutron-6eb1-account-create-update-czskk" Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.086814 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30202162-03c8-4d05-b31d-fbb7900ae067-operator-scripts\") pod \"neutron-6eb1-account-create-update-czskk\" (UID: \"30202162-03c8-4d05-b31d-fbb7900ae067\") " pod="openstack/neutron-6eb1-account-create-update-czskk" Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.088662 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11931045-4e2b-4720-9f3b-745b2215e3ac-operator-scripts\") pod \"neutron-db-create-mf52x\" (UID: \"11931045-4e2b-4720-9f3b-745b2215e3ac\") " pod="openstack/neutron-db-create-mf52x" Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.100345 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mh5z9" Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.106262 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqjch\" (UniqueName: \"kubernetes.io/projected/30202162-03c8-4d05-b31d-fbb7900ae067-kube-api-access-cqjch\") pod \"neutron-6eb1-account-create-update-czskk\" (UID: \"30202162-03c8-4d05-b31d-fbb7900ae067\") " pod="openstack/neutron-6eb1-account-create-update-czskk" Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.111050 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkq7k\" (UniqueName: \"kubernetes.io/projected/11931045-4e2b-4720-9f3b-745b2215e3ac-kube-api-access-rkq7k\") pod \"neutron-db-create-mf52x\" (UID: \"11931045-4e2b-4720-9f3b-745b2215e3ac\") " pod="openstack/neutron-db-create-mf52x" Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.147767 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mf52x" Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.184283 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6eb1-account-create-update-czskk" Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.246941 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ntj72"] Dec 05 06:11:28 crc kubenswrapper[4865]: W1205 06:11:28.261111 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75d26b7c_932e_4c37_95d3_9f5dc0b874b5.slice/crio-3d390c9eb4b2f6fd696c5aa50b8ee801377266235221b97f9aeaec415b29997d WatchSource:0}: Error finding container 3d390c9eb4b2f6fd696c5aa50b8ee801377266235221b97f9aeaec415b29997d: Status 404 returned error can't find the container with id 3d390c9eb4b2f6fd696c5aa50b8ee801377266235221b97f9aeaec415b29997d Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.406509 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bbd2l"] Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.784959 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.862968 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2426-account-create-update-jpx49"] Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.893567 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-16c7-account-create-update-w2r86"] Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.898790 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ntj72" event={"ID":"75d26b7c-932e-4c37-95d3-9f5dc0b874b5","Type":"ContainerStarted","Data":"8e5c2d3d4e7df6172c4efc9f25b33f8a738a71181a274dfd788c0efdff9b041e"} Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.898842 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ntj72" event={"ID":"75d26b7c-932e-4c37-95d3-9f5dc0b874b5","Type":"ContainerStarted","Data":"3d390c9eb4b2f6fd696c5aa50b8ee801377266235221b97f9aeaec415b29997d"} Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.909799 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdgv6\" (UniqueName: \"kubernetes.io/projected/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-kube-api-access-rdgv6\") pod \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.909877 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-ovsdbserver-sb\") pod \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.909945 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-ovsdbserver-nb\") pod \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.909985 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-dns-svc\") pod \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.910068 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-config\") pod \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.910143 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-dns-swift-storage-0\") pod \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\" (UID: \"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947\") " Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.928687 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bbd2l" event={"ID":"fde195ed-2c6b-465f-b496-97c7d604f1c6","Type":"ContainerStarted","Data":"bd379a7f707eb9e4bcd18d9f4d4e47111ad3d132718a50ba1a509fc5cb1dac96"} Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.928737 4865 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bbd2l" event={"ID":"fde195ed-2c6b-465f-b496-97c7d604f1c6","Type":"ContainerStarted","Data":"8a3869e4622575c18fef3c44889faa0bdb16d8b9fe94637404a27408f9196b82"} Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.932031 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2426-account-create-update-jpx49" event={"ID":"49ad1f76-ae1d-4f90-9fcd-819dc43fc598","Type":"ContainerStarted","Data":"a92491ccfd18b8a7426e79065d655bbc2a32354e51b6a67308b0382f92bdeccd"} Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.937001 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-ntj72" podStartSLOduration=1.936977734 podStartE2EDuration="1.936977734s" podCreationTimestamp="2025-12-05 06:11:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:11:28.926116577 +0000 UTC m=+1108.206127819" watchObservedRunningTime="2025-12-05 06:11:28.936977734 +0000 UTC m=+1108.216988966" Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.953155 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" event={"ID":"d09f4d0d-afcd-4e4b-bab2-aa5b2f110947","Type":"ContainerDied","Data":"913ea858072575a473b6bbe2445ace3f84296026ccc6fe2f3526a25192067df8"} Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.953206 4865 scope.go:117] "RemoveContainer" containerID="4a656a530bc6484b2337f17bc7e1704ff915e09f95702ccac5810519d132dc91" Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.953341 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-qxf4f" Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.969244 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mh5z9"] Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.973962 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-bbd2l" podStartSLOduration=1.97394912 podStartE2EDuration="1.97394912s" podCreationTimestamp="2025-12-05 06:11:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:11:28.950513407 +0000 UTC m=+1108.230524629" watchObservedRunningTime="2025-12-05 06:11:28.97394912 +0000 UTC m=+1108.253960342" Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.995517 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mf52x"] Dec 05 06:11:28 crc kubenswrapper[4865]: I1205 06:11:28.998714 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-kube-api-access-rdgv6" (OuterVolumeSpecName: "kube-api-access-rdgv6") pod "d09f4d0d-afcd-4e4b-bab2-aa5b2f110947" (UID: "d09f4d0d-afcd-4e4b-bab2-aa5b2f110947"). InnerVolumeSpecName "kube-api-access-rdgv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.013400 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdgv6\" (UniqueName: \"kubernetes.io/projected/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-kube-api-access-rdgv6\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.025069 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6eb1-account-create-update-czskk"] Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.080522 4865 scope.go:117] "RemoveContainer" containerID="18ed71db1dfd58a39493995109e453a024d482c527319af90c4b0b98c037c58b" Dec 05 06:11:29 crc kubenswrapper[4865]: W1205 06:11:29.080670 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11931045_4e2b_4720_9f3b_745b2215e3ac.slice/crio-b0d1109a7e6dcd2c5a1fa8ff0c32718acc9ce95a6aa4014ef85b1746b899f3a6 WatchSource:0}: Error finding container b0d1109a7e6dcd2c5a1fa8ff0c32718acc9ce95a6aa4014ef85b1746b899f3a6: Status 404 returned error can't find the container with id b0d1109a7e6dcd2c5a1fa8ff0c32718acc9ce95a6aa4014ef85b1746b899f3a6 Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.131721 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d09f4d0d-afcd-4e4b-bab2-aa5b2f110947" (UID: "d09f4d0d-afcd-4e4b-bab2-aa5b2f110947"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.138312 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d09f4d0d-afcd-4e4b-bab2-aa5b2f110947" (UID: "d09f4d0d-afcd-4e4b-bab2-aa5b2f110947"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.156641 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d09f4d0d-afcd-4e4b-bab2-aa5b2f110947" (UID: "d09f4d0d-afcd-4e4b-bab2-aa5b2f110947"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.161402 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d09f4d0d-afcd-4e4b-bab2-aa5b2f110947" (UID: "d09f4d0d-afcd-4e4b-bab2-aa5b2f110947"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.165072 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-config" (OuterVolumeSpecName: "config") pod "d09f4d0d-afcd-4e4b-bab2-aa5b2f110947" (UID: "d09f4d0d-afcd-4e4b-bab2-aa5b2f110947"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.219695 4865 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.219734 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.219745 4865 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.219755 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.219763 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:29 crc kubenswrapper[4865]: E1205 06:11:29.239005 4865 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfde195ed_2c6b_465f_b496_97c7d604f1c6.slice/crio-conmon-bd379a7f707eb9e4bcd18d9f4d4e47111ad3d132718a50ba1a509fc5cb1dac96.scope\": RecentStats: unable to find data in memory cache]" Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.296798 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-qxf4f"] Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.309426 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-qxf4f"] Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.967620 4865 generic.go:334] "Generic (PLEG): container finished" podID="49ad1f76-ae1d-4f90-9fcd-819dc43fc598" containerID="4dde5a9d8f8a9f7a0e3e18ebb1c3818bf01d24ffbff326e631bb27f89a42531c" exitCode=0 Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.967728 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2426-account-create-update-jpx49" event={"ID":"49ad1f76-ae1d-4f90-9fcd-819dc43fc598","Type":"ContainerDied","Data":"4dde5a9d8f8a9f7a0e3e18ebb1c3818bf01d24ffbff326e631bb27f89a42531c"} Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.969343 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mh5z9" event={"ID":"2b061e42-22fc-4b4b-8ebe-20496b1a9f17","Type":"ContainerStarted","Data":"4791890009f17d1969588fdc0d8f1c2955ecf9371f45fb7a1493a780c5a41867"} Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.974325 4865 generic.go:334] "Generic (PLEG): container finished" podID="11931045-4e2b-4720-9f3b-745b2215e3ac" containerID="e36bca61d1c70996f3fe55847ad2ef9574f572837d618bf7422a89620a376f28" exitCode=0 Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.974380 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mf52x" 
event={"ID":"11931045-4e2b-4720-9f3b-745b2215e3ac","Type":"ContainerDied","Data":"e36bca61d1c70996f3fe55847ad2ef9574f572837d618bf7422a89620a376f28"} Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.974402 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mf52x" event={"ID":"11931045-4e2b-4720-9f3b-745b2215e3ac","Type":"ContainerStarted","Data":"b0d1109a7e6dcd2c5a1fa8ff0c32718acc9ce95a6aa4014ef85b1746b899f3a6"} Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.976348 4865 generic.go:334] "Generic (PLEG): container finished" podID="75d26b7c-932e-4c37-95d3-9f5dc0b874b5" containerID="8e5c2d3d4e7df6172c4efc9f25b33f8a738a71181a274dfd788c0efdff9b041e" exitCode=0 Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.976464 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ntj72" event={"ID":"75d26b7c-932e-4c37-95d3-9f5dc0b874b5","Type":"ContainerDied","Data":"8e5c2d3d4e7df6172c4efc9f25b33f8a738a71181a274dfd788c0efdff9b041e"} Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.979388 4865 generic.go:334] "Generic (PLEG): container finished" podID="30202162-03c8-4d05-b31d-fbb7900ae067" containerID="b2dd81e91639317e8627af8ae3f58a29590afd371319684aff06671931431e02" exitCode=0 Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.979427 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6eb1-account-create-update-czskk" event={"ID":"30202162-03c8-4d05-b31d-fbb7900ae067","Type":"ContainerDied","Data":"b2dd81e91639317e8627af8ae3f58a29590afd371319684aff06671931431e02"} Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.979441 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6eb1-account-create-update-czskk" event={"ID":"30202162-03c8-4d05-b31d-fbb7900ae067","Type":"ContainerStarted","Data":"d04ece54f60916cc4c894613783b287374847eec300128eccc96b8fe46315d40"} Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.981298 4865 generic.go:334] "Generic (PLEG): container finished" podID="fde195ed-2c6b-465f-b496-97c7d604f1c6" containerID="bd379a7f707eb9e4bcd18d9f4d4e47111ad3d132718a50ba1a509fc5cb1dac96" exitCode=0 Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.981343 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bbd2l" event={"ID":"fde195ed-2c6b-465f-b496-97c7d604f1c6","Type":"ContainerDied","Data":"bd379a7f707eb9e4bcd18d9f4d4e47111ad3d132718a50ba1a509fc5cb1dac96"} Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.984287 4865 generic.go:334] "Generic (PLEG): container finished" podID="66a615ef-1dfa-45c7-8d5d-bb694d7d13ad" containerID="7091a08c26e97d59579c6ea300531c523e6460ca30e5d89e01208cd1e166a674" exitCode=0 Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.984316 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-16c7-account-create-update-w2r86" event={"ID":"66a615ef-1dfa-45c7-8d5d-bb694d7d13ad","Type":"ContainerDied","Data":"7091a08c26e97d59579c6ea300531c523e6460ca30e5d89e01208cd1e166a674"} Dec 05 06:11:29 crc kubenswrapper[4865]: I1205 06:11:29.984332 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-16c7-account-create-update-w2r86" event={"ID":"66a615ef-1dfa-45c7-8d5d-bb694d7d13ad","Type":"ContainerStarted","Data":"a4ef00cc8eaf3dd3ef5eff5a28fd1d88cec76fafb96bc95f71bd1cb20d97fb29"} Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.025339 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d09f4d0d-afcd-4e4b-bab2-aa5b2f110947" path="/var/lib/kubelet/pods/d09f4d0d-afcd-4e4b-bab2-aa5b2f110947/volumes" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.296888 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mf52x" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.463439 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkq7k\" (UniqueName: \"kubernetes.io/projected/11931045-4e2b-4720-9f3b-745b2215e3ac-kube-api-access-rkq7k\") pod \"11931045-4e2b-4720-9f3b-745b2215e3ac\" (UID: \"11931045-4e2b-4720-9f3b-745b2215e3ac\") " Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.463493 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11931045-4e2b-4720-9f3b-745b2215e3ac-operator-scripts\") pod \"11931045-4e2b-4720-9f3b-745b2215e3ac\" (UID: \"11931045-4e2b-4720-9f3b-745b2215e3ac\") " Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.464500 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11931045-4e2b-4720-9f3b-745b2215e3ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11931045-4e2b-4720-9f3b-745b2215e3ac" (UID: "11931045-4e2b-4720-9f3b-745b2215e3ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.469956 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11931045-4e2b-4720-9f3b-745b2215e3ac-kube-api-access-rkq7k" (OuterVolumeSpecName: "kube-api-access-rkq7k") pod "11931045-4e2b-4720-9f3b-745b2215e3ac" (UID: "11931045-4e2b-4720-9f3b-745b2215e3ac"). InnerVolumeSpecName "kube-api-access-rkq7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.565864 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkq7k\" (UniqueName: \"kubernetes.io/projected/11931045-4e2b-4720-9f3b-745b2215e3ac-kube-api-access-rkq7k\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.565898 4865 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11931045-4e2b-4720-9f3b-745b2215e3ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.577593 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2426-account-create-update-jpx49" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.587781 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-16c7-account-create-update-w2r86" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.627622 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ntj72" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.668533 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6eb1-account-create-update-czskk" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.688864 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-bbd2l" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.771173 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqjch\" (UniqueName: \"kubernetes.io/projected/30202162-03c8-4d05-b31d-fbb7900ae067-kube-api-access-cqjch\") pod \"30202162-03c8-4d05-b31d-fbb7900ae067\" (UID: \"30202162-03c8-4d05-b31d-fbb7900ae067\") " Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.771231 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30202162-03c8-4d05-b31d-fbb7900ae067-operator-scripts\") pod \"30202162-03c8-4d05-b31d-fbb7900ae067\" (UID: \"30202162-03c8-4d05-b31d-fbb7900ae067\") " Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.771269 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftnwj\" (UniqueName: \"kubernetes.io/projected/66a615ef-1dfa-45c7-8d5d-bb694d7d13ad-kube-api-access-ftnwj\") pod \"66a615ef-1dfa-45c7-8d5d-bb694d7d13ad\" (UID: \"66a615ef-1dfa-45c7-8d5d-bb694d7d13ad\") " Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.771320 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49ad1f76-ae1d-4f90-9fcd-819dc43fc598-operator-scripts\") pod \"49ad1f76-ae1d-4f90-9fcd-819dc43fc598\" (UID: \"49ad1f76-ae1d-4f90-9fcd-819dc43fc598\") " Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.771349 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzt8l\" (UniqueName: \"kubernetes.io/projected/75d26b7c-932e-4c37-95d3-9f5dc0b874b5-kube-api-access-rzt8l\") pod \"75d26b7c-932e-4c37-95d3-9f5dc0b874b5\" (UID: \"75d26b7c-932e-4c37-95d3-9f5dc0b874b5\") " Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.771416 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75d26b7c-932e-4c37-95d3-9f5dc0b874b5-operator-scripts\") pod \"75d26b7c-932e-4c37-95d3-9f5dc0b874b5\" (UID: \"75d26b7c-932e-4c37-95d3-9f5dc0b874b5\") " Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.771438 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66a615ef-1dfa-45c7-8d5d-bb694d7d13ad-operator-scripts\") pod \"66a615ef-1dfa-45c7-8d5d-bb694d7d13ad\" (UID: \"66a615ef-1dfa-45c7-8d5d-bb694d7d13ad\") " Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.771526 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2c77\" (UniqueName: \"kubernetes.io/projected/49ad1f76-ae1d-4f90-9fcd-819dc43fc598-kube-api-access-b2c77\") pod \"49ad1f76-ae1d-4f90-9fcd-819dc43fc598\" (UID: \"49ad1f76-ae1d-4f90-9fcd-819dc43fc598\") " Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.771779 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ad1f76-ae1d-4f90-9fcd-819dc43fc598-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "49ad1f76-ae1d-4f90-9fcd-819dc43fc598" (UID: "49ad1f76-ae1d-4f90-9fcd-819dc43fc598"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.771869 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30202162-03c8-4d05-b31d-fbb7900ae067-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30202162-03c8-4d05-b31d-fbb7900ae067" (UID: "30202162-03c8-4d05-b31d-fbb7900ae067"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.772011 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75d26b7c-932e-4c37-95d3-9f5dc0b874b5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "75d26b7c-932e-4c37-95d3-9f5dc0b874b5" (UID: "75d26b7c-932e-4c37-95d3-9f5dc0b874b5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.772020 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66a615ef-1dfa-45c7-8d5d-bb694d7d13ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66a615ef-1dfa-45c7-8d5d-bb694d7d13ad" (UID: "66a615ef-1dfa-45c7-8d5d-bb694d7d13ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.772638 4865 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75d26b7c-932e-4c37-95d3-9f5dc0b874b5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.772668 4865 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66a615ef-1dfa-45c7-8d5d-bb694d7d13ad-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.772681 4865 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30202162-03c8-4d05-b31d-fbb7900ae067-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.772697 4865 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49ad1f76-ae1d-4f90-9fcd-819dc43fc598-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.774572 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30202162-03c8-4d05-b31d-fbb7900ae067-kube-api-access-cqjch" (OuterVolumeSpecName: "kube-api-access-cqjch") pod "30202162-03c8-4d05-b31d-fbb7900ae067" (UID: "30202162-03c8-4d05-b31d-fbb7900ae067"). InnerVolumeSpecName "kube-api-access-cqjch". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.774607 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66a615ef-1dfa-45c7-8d5d-bb694d7d13ad-kube-api-access-ftnwj" (OuterVolumeSpecName: "kube-api-access-ftnwj") pod "66a615ef-1dfa-45c7-8d5d-bb694d7d13ad" (UID: "66a615ef-1dfa-45c7-8d5d-bb694d7d13ad"). InnerVolumeSpecName "kube-api-access-ftnwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.775907 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ad1f76-ae1d-4f90-9fcd-819dc43fc598-kube-api-access-b2c77" (OuterVolumeSpecName: "kube-api-access-b2c77") pod "49ad1f76-ae1d-4f90-9fcd-819dc43fc598" (UID: "49ad1f76-ae1d-4f90-9fcd-819dc43fc598"). InnerVolumeSpecName "kube-api-access-b2c77". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.776356 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75d26b7c-932e-4c37-95d3-9f5dc0b874b5-kube-api-access-rzt8l" (OuterVolumeSpecName: "kube-api-access-rzt8l") pod "75d26b7c-932e-4c37-95d3-9f5dc0b874b5" (UID: "75d26b7c-932e-4c37-95d3-9f5dc0b874b5"). InnerVolumeSpecName "kube-api-access-rzt8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.873764 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkfvn\" (UniqueName: \"kubernetes.io/projected/fde195ed-2c6b-465f-b496-97c7d604f1c6-kube-api-access-pkfvn\") pod \"fde195ed-2c6b-465f-b496-97c7d604f1c6\" (UID: \"fde195ed-2c6b-465f-b496-97c7d604f1c6\") " Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.873816 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fde195ed-2c6b-465f-b496-97c7d604f1c6-operator-scripts\") pod \"fde195ed-2c6b-465f-b496-97c7d604f1c6\" (UID: \"fde195ed-2c6b-465f-b496-97c7d604f1c6\") " Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.874381 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqjch\" (UniqueName: \"kubernetes.io/projected/30202162-03c8-4d05-b31d-fbb7900ae067-kube-api-access-cqjch\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.874404 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftnwj\" (UniqueName: \"kubernetes.io/projected/66a615ef-1dfa-45c7-8d5d-bb694d7d13ad-kube-api-access-ftnwj\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.874417 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzt8l\" (UniqueName: \"kubernetes.io/projected/75d26b7c-932e-4c37-95d3-9f5dc0b874b5-kube-api-access-rzt8l\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.874427 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2c77\" (UniqueName: \"kubernetes.io/projected/49ad1f76-ae1d-4f90-9fcd-819dc43fc598-kube-api-access-b2c77\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.874607 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fde195ed-2c6b-465f-b496-97c7d604f1c6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fde195ed-2c6b-465f-b496-97c7d604f1c6" (UID: "fde195ed-2c6b-465f-b496-97c7d604f1c6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.876459 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fde195ed-2c6b-465f-b496-97c7d604f1c6-kube-api-access-pkfvn" (OuterVolumeSpecName: "kube-api-access-pkfvn") pod "fde195ed-2c6b-465f-b496-97c7d604f1c6" (UID: "fde195ed-2c6b-465f-b496-97c7d604f1c6"). InnerVolumeSpecName "kube-api-access-pkfvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.975774 4865 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fde195ed-2c6b-465f-b496-97c7d604f1c6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:31 crc kubenswrapper[4865]: I1205 06:11:31.976060 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkfvn\" (UniqueName: \"kubernetes.io/projected/fde195ed-2c6b-465f-b496-97c7d604f1c6-kube-api-access-pkfvn\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:32 crc kubenswrapper[4865]: I1205 06:11:32.008566 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ntj72" event={"ID":"75d26b7c-932e-4c37-95d3-9f5dc0b874b5","Type":"ContainerDied","Data":"3d390c9eb4b2f6fd696c5aa50b8ee801377266235221b97f9aeaec415b29997d"} Dec 05 06:11:32 crc kubenswrapper[4865]: I1205 06:11:32.008806 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d390c9eb4b2f6fd696c5aa50b8ee801377266235221b97f9aeaec415b29997d" Dec 05 06:11:32 crc kubenswrapper[4865]: I1205 06:11:32.008897 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ntj72" Dec 05 06:11:32 crc kubenswrapper[4865]: I1205 06:11:32.016883 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6eb1-account-create-update-czskk" event={"ID":"30202162-03c8-4d05-b31d-fbb7900ae067","Type":"ContainerDied","Data":"d04ece54f60916cc4c894613783b287374847eec300128eccc96b8fe46315d40"} Dec 05 06:11:32 crc kubenswrapper[4865]: I1205 06:11:32.016950 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d04ece54f60916cc4c894613783b287374847eec300128eccc96b8fe46315d40" Dec 05 06:11:32 crc kubenswrapper[4865]: I1205 06:11:32.016923 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6eb1-account-create-update-czskk" Dec 05 06:11:32 crc kubenswrapper[4865]: I1205 06:11:32.020779 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bbd2l" event={"ID":"fde195ed-2c6b-465f-b496-97c7d604f1c6","Type":"ContainerDied","Data":"8a3869e4622575c18fef3c44889faa0bdb16d8b9fe94637404a27408f9196b82"} Dec 05 06:11:32 crc kubenswrapper[4865]: I1205 06:11:32.020814 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-bbd2l" Dec 05 06:11:32 crc kubenswrapper[4865]: I1205 06:11:32.020834 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a3869e4622575c18fef3c44889faa0bdb16d8b9fe94637404a27408f9196b82" Dec 05 06:11:32 crc kubenswrapper[4865]: I1205 06:11:32.023581 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-16c7-account-create-update-w2r86" event={"ID":"66a615ef-1dfa-45c7-8d5d-bb694d7d13ad","Type":"ContainerDied","Data":"a4ef00cc8eaf3dd3ef5eff5a28fd1d88cec76fafb96bc95f71bd1cb20d97fb29"} Dec 05 06:11:32 crc kubenswrapper[4865]: I1205 06:11:32.023622 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-16c7-account-create-update-w2r86" Dec 05 06:11:32 crc kubenswrapper[4865]: I1205 06:11:32.023624 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4ef00cc8eaf3dd3ef5eff5a28fd1d88cec76fafb96bc95f71bd1cb20d97fb29" Dec 05 06:11:32 crc kubenswrapper[4865]: I1205 06:11:32.030911 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2426-account-create-update-jpx49" event={"ID":"49ad1f76-ae1d-4f90-9fcd-819dc43fc598","Type":"ContainerDied","Data":"a92491ccfd18b8a7426e79065d655bbc2a32354e51b6a67308b0382f92bdeccd"} Dec 05 06:11:32 crc kubenswrapper[4865]: I1205 06:11:32.030956 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a92491ccfd18b8a7426e79065d655bbc2a32354e51b6a67308b0382f92bdeccd" Dec 05 06:11:32 crc kubenswrapper[4865]: I1205 06:11:32.031005 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2426-account-create-update-jpx49" Dec 05 06:11:32 crc kubenswrapper[4865]: I1205 06:11:32.041846 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mf52x" event={"ID":"11931045-4e2b-4720-9f3b-745b2215e3ac","Type":"ContainerDied","Data":"b0d1109a7e6dcd2c5a1fa8ff0c32718acc9ce95a6aa4014ef85b1746b899f3a6"} Dec 05 06:11:32 crc kubenswrapper[4865]: I1205 06:11:32.041885 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-mf52x" Dec 05 06:11:32 crc kubenswrapper[4865]: I1205 06:11:32.041887 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0d1109a7e6dcd2c5a1fa8ff0c32718acc9ce95a6aa4014ef85b1746b899f3a6" Dec 05 06:11:36 crc kubenswrapper[4865]: I1205 06:11:36.076740 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mh5z9" event={"ID":"2b061e42-22fc-4b4b-8ebe-20496b1a9f17","Type":"ContainerStarted","Data":"f8e43214c4c3a9e4ed8bbd0e1bce8943bef1b7b3c16c4984f7e3aa526397e66f"} Dec 05 06:11:36 crc kubenswrapper[4865]: I1205 06:11:36.105228 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-mh5z9" podStartSLOduration=3.390133007 podStartE2EDuration="9.105206726s" podCreationTimestamp="2025-12-05 06:11:27 +0000 UTC" firstStartedPulling="2025-12-05 06:11:29.079583959 +0000 UTC m=+1108.359595181" lastFinishedPulling="2025-12-05 06:11:34.794657638 +0000 UTC m=+1114.074668900" observedRunningTime="2025-12-05 06:11:36.09543794 +0000 UTC m=+1115.375449182" watchObservedRunningTime="2025-12-05 06:11:36.105206726 +0000 UTC m=+1115.385217948" Dec 05 06:11:39 crc kubenswrapper[4865]: I1205 06:11:39.109937 4865 generic.go:334] "Generic (PLEG): container finished" podID="2b061e42-22fc-4b4b-8ebe-20496b1a9f17" containerID="f8e43214c4c3a9e4ed8bbd0e1bce8943bef1b7b3c16c4984f7e3aa526397e66f" exitCode=0 Dec 05 06:11:39 crc kubenswrapper[4865]: I1205 06:11:39.110124 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mh5z9" event={"ID":"2b061e42-22fc-4b4b-8ebe-20496b1a9f17","Type":"ContainerDied","Data":"f8e43214c4c3a9e4ed8bbd0e1bce8943bef1b7b3c16c4984f7e3aa526397e66f"} Dec 05 06:11:40 crc kubenswrapper[4865]: I1205 06:11:40.451748 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mh5z9" Dec 05 06:11:40 crc kubenswrapper[4865]: I1205 06:11:40.634783 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b061e42-22fc-4b4b-8ebe-20496b1a9f17-combined-ca-bundle\") pod \"2b061e42-22fc-4b4b-8ebe-20496b1a9f17\" (UID: \"2b061e42-22fc-4b4b-8ebe-20496b1a9f17\") " Dec 05 06:11:40 crc kubenswrapper[4865]: I1205 06:11:40.635067 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b061e42-22fc-4b4b-8ebe-20496b1a9f17-config-data\") pod \"2b061e42-22fc-4b4b-8ebe-20496b1a9f17\" (UID: \"2b061e42-22fc-4b4b-8ebe-20496b1a9f17\") " Dec 05 06:11:40 crc kubenswrapper[4865]: I1205 06:11:40.635162 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k5d5\" (UniqueName: \"kubernetes.io/projected/2b061e42-22fc-4b4b-8ebe-20496b1a9f17-kube-api-access-9k5d5\") pod \"2b061e42-22fc-4b4b-8ebe-20496b1a9f17\" (UID: \"2b061e42-22fc-4b4b-8ebe-20496b1a9f17\") " Dec 05 06:11:40 crc kubenswrapper[4865]: I1205 06:11:40.653408 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b061e42-22fc-4b4b-8ebe-20496b1a9f17-kube-api-access-9k5d5" (OuterVolumeSpecName: "kube-api-access-9k5d5") pod "2b061e42-22fc-4b4b-8ebe-20496b1a9f17" (UID: "2b061e42-22fc-4b4b-8ebe-20496b1a9f17"). InnerVolumeSpecName "kube-api-access-9k5d5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:11:40 crc kubenswrapper[4865]: I1205 06:11:40.691601 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b061e42-22fc-4b4b-8ebe-20496b1a9f17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b061e42-22fc-4b4b-8ebe-20496b1a9f17" (UID: "2b061e42-22fc-4b4b-8ebe-20496b1a9f17"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:40 crc kubenswrapper[4865]: I1205 06:11:40.707363 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b061e42-22fc-4b4b-8ebe-20496b1a9f17-config-data" (OuterVolumeSpecName: "config-data") pod "2b061e42-22fc-4b4b-8ebe-20496b1a9f17" (UID: "2b061e42-22fc-4b4b-8ebe-20496b1a9f17"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:40 crc kubenswrapper[4865]: I1205 06:11:40.737724 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b061e42-22fc-4b4b-8ebe-20496b1a9f17-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:40 crc kubenswrapper[4865]: I1205 06:11:40.737776 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k5d5\" (UniqueName: \"kubernetes.io/projected/2b061e42-22fc-4b4b-8ebe-20496b1a9f17-kube-api-access-9k5d5\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:40 crc kubenswrapper[4865]: I1205 06:11:40.737791 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b061e42-22fc-4b4b-8ebe-20496b1a9f17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.050349 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.050989 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.133728 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mh5z9" event={"ID":"2b061e42-22fc-4b4b-8ebe-20496b1a9f17","Type":"ContainerDied","Data":"4791890009f17d1969588fdc0d8f1c2955ecf9371f45fb7a1493a780c5a41867"} Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.133812 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4791890009f17d1969588fdc0d8f1c2955ecf9371f45fb7a1493a780c5a41867" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.133882 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-mh5z9" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.417638 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-mczg5"] Dec 05 06:11:41 crc kubenswrapper[4865]: E1205 06:11:41.418018 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66a615ef-1dfa-45c7-8d5d-bb694d7d13ad" containerName="mariadb-account-create-update" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.418036 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a615ef-1dfa-45c7-8d5d-bb694d7d13ad" containerName="mariadb-account-create-update" Dec 05 06:11:41 crc kubenswrapper[4865]: E1205 06:11:41.418047 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11931045-4e2b-4720-9f3b-745b2215e3ac" containerName="mariadb-database-create" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.418055 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="11931045-4e2b-4720-9f3b-745b2215e3ac" containerName="mariadb-database-create" Dec 05 06:11:41 crc kubenswrapper[4865]: E1205 06:11:41.418065 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d26b7c-932e-4c37-95d3-9f5dc0b874b5" containerName="mariadb-database-create" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.418072 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d26b7c-932e-4c37-95d3-9f5dc0b874b5" containerName="mariadb-database-create" Dec 05 06:11:41 crc kubenswrapper[4865]: E1205 06:11:41.418087 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d09f4d0d-afcd-4e4b-bab2-aa5b2f110947" containerName="dnsmasq-dns" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.418093 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="d09f4d0d-afcd-4e4b-bab2-aa5b2f110947" containerName="dnsmasq-dns" Dec 05 06:11:41 crc kubenswrapper[4865]: E1205 06:11:41.418107 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fde195ed-2c6b-465f-b496-97c7d604f1c6" containerName="mariadb-database-create" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.418113 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="fde195ed-2c6b-465f-b496-97c7d604f1c6" containerName="mariadb-database-create" Dec 05 06:11:41 crc kubenswrapper[4865]: E1205 06:11:41.418127 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d09f4d0d-afcd-4e4b-bab2-aa5b2f110947" containerName="init" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.418133 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="d09f4d0d-afcd-4e4b-bab2-aa5b2f110947" containerName="init" Dec 05 06:11:41 crc kubenswrapper[4865]: E1205 06:11:41.418144 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ad1f76-ae1d-4f90-9fcd-819dc43fc598" containerName="mariadb-account-create-update" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.418152 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ad1f76-ae1d-4f90-9fcd-819dc43fc598" containerName="mariadb-account-create-update" Dec 05 06:11:41 crc kubenswrapper[4865]: E1205 06:11:41.418161 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b061e42-22fc-4b4b-8ebe-20496b1a9f17" containerName="keystone-db-sync" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.418170 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b061e42-22fc-4b4b-8ebe-20496b1a9f17" containerName="keystone-db-sync" Dec 05 06:11:41 crc kubenswrapper[4865]: E1205 06:11:41.418177 4865 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30202162-03c8-4d05-b31d-fbb7900ae067" containerName="mariadb-account-create-update" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.418183 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="30202162-03c8-4d05-b31d-fbb7900ae067" containerName="mariadb-account-create-update" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.418352 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="30202162-03c8-4d05-b31d-fbb7900ae067" containerName="mariadb-account-create-update" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.418365 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ad1f76-ae1d-4f90-9fcd-819dc43fc598" containerName="mariadb-account-create-update" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.418374 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="75d26b7c-932e-4c37-95d3-9f5dc0b874b5" containerName="mariadb-database-create" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.418388 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="d09f4d0d-afcd-4e4b-bab2-aa5b2f110947" containerName="dnsmasq-dns" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.418399 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="fde195ed-2c6b-465f-b496-97c7d604f1c6" containerName="mariadb-database-create" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.418412 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="66a615ef-1dfa-45c7-8d5d-bb694d7d13ad" containerName="mariadb-account-create-update" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.418423 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="11931045-4e2b-4720-9f3b-745b2215e3ac" containerName="mariadb-database-create" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.418435 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b061e42-22fc-4b4b-8ebe-20496b1a9f17" containerName="keystone-db-sync" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.419091 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.426206 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.426301 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nq2gr" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.426444 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.426454 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.426683 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.437516 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mczg5"] Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.446560 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-rfnjt"] Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.448014 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.480706 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-rfnjt"] Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.556348 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-rfnjt\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.556632 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-config-data\") pod \"keystone-bootstrap-mczg5\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.556769 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf82f\" (UniqueName: \"kubernetes.io/projected/fb46cc5d-4c21-482c-85f5-76e1726d0d99-kube-api-access-kf82f\") pod \"dnsmasq-dns-6c9c9f998c-rfnjt\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.556901 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-rfnjt\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.557025 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-rfnjt\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.557119 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-fernet-keys\") pod \"keystone-bootstrap-mczg5\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.557216 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-credential-keys\") pod \"keystone-bootstrap-mczg5\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.557344 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-rfnjt\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.557457 4865 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv64f\" (UniqueName: \"kubernetes.io/projected/774b2338-1b59-4df3-ac6f-939724c29231-kube-api-access-fv64f\") pod \"keystone-bootstrap-mczg5\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.557564 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-combined-ca-bundle\") pod \"keystone-bootstrap-mczg5\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.557964 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-scripts\") pod \"keystone-bootstrap-mczg5\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.558118 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-config\") pod \"dnsmasq-dns-6c9c9f998c-rfnjt\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.659262 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-rfnjt\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.659314 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-config-data\") pod \"keystone-bootstrap-mczg5\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.659338 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf82f\" (UniqueName: \"kubernetes.io/projected/fb46cc5d-4c21-482c-85f5-76e1726d0d99-kube-api-access-kf82f\") pod \"dnsmasq-dns-6c9c9f998c-rfnjt\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.659356 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-rfnjt\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.659388 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-rfnjt\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.659404 4865 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-fernet-keys\") pod \"keystone-bootstrap-mczg5\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.659425 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-credential-keys\") pod \"keystone-bootstrap-mczg5\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.659464 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-rfnjt\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.659492 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv64f\" (UniqueName: \"kubernetes.io/projected/774b2338-1b59-4df3-ac6f-939724c29231-kube-api-access-fv64f\") pod \"keystone-bootstrap-mczg5\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.659510 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-combined-ca-bundle\") pod \"keystone-bootstrap-mczg5\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.659531 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-scripts\") pod \"keystone-bootstrap-mczg5\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.659562 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-config\") pod \"dnsmasq-dns-6c9c9f998c-rfnjt\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.660233 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-rfnjt\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.660426 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-config\") pod \"dnsmasq-dns-6c9c9f998c-rfnjt\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.661070 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-rfnjt\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.665990 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-rfnjt\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.668586 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-scripts\") pod \"keystone-bootstrap-mczg5\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.668993 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-rfnjt\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.671445 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-credential-keys\") pod \"keystone-bootstrap-mczg5\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.672539 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-combined-ca-bundle\") pod \"keystone-bootstrap-mczg5\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.680180 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-config-data\") pod \"keystone-bootstrap-mczg5\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.690862 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-fernet-keys\") pod \"keystone-bootstrap-mczg5\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.692668 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf82f\" (UniqueName: \"kubernetes.io/projected/fb46cc5d-4c21-482c-85f5-76e1726d0d99-kube-api-access-kf82f\") pod \"dnsmasq-dns-6c9c9f998c-rfnjt\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.698502 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv64f\" (UniqueName: \"kubernetes.io/projected/774b2338-1b59-4df3-ac6f-939724c29231-kube-api-access-fv64f\") pod \"keystone-bootstrap-mczg5\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " 
pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.739197 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.766145 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.847434 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-f7d5b"] Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.848960 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.874641 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-f7d5b"] Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.876294 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.879685 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.889336 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-585c4d6d99-pjvhk"] Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.899586 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-585c4d6d99-pjvhk" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.908340 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-pgfdn" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.915498 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.915698 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-ks8zr" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.915864 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.916004 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.965989 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-db-sync-config-data\") pod \"cinder-db-sync-f7d5b\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.966034 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-scripts\") pod \"cinder-db-sync-f7d5b\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.966099 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh5bv\" (UniqueName: \"kubernetes.io/projected/b5e4dce7-c9e7-4813-a957-1df502644792-kube-api-access-kh5bv\") pod \"cinder-db-sync-f7d5b\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " 
pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.966124 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-combined-ca-bundle\") pod \"cinder-db-sync-f7d5b\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.966150 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b5e4dce7-c9e7-4813-a957-1df502644792-etc-machine-id\") pod \"cinder-db-sync-f7d5b\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:41 crc kubenswrapper[4865]: I1205 06:11:41.966170 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-config-data\") pod \"cinder-db-sync-f7d5b\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.007892 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-585c4d6d99-pjvhk"] Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.065862 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-g8vbt"] Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.066946 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-g8vbt" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.073286 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/698c990e-e8b5-4c1b-8177-4e44874f4a44-logs\") pod \"horizon-585c4d6d99-pjvhk\" (UID: \"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " pod="openstack/horizon-585c4d6d99-pjvhk" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.073337 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh5bv\" (UniqueName: \"kubernetes.io/projected/b5e4dce7-c9e7-4813-a957-1df502644792-kube-api-access-kh5bv\") pod \"cinder-db-sync-f7d5b\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.073370 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-combined-ca-bundle\") pod \"cinder-db-sync-f7d5b\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.073392 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7hs9\" (UniqueName: \"kubernetes.io/projected/698c990e-e8b5-4c1b-8177-4e44874f4a44-kube-api-access-t7hs9\") pod \"horizon-585c4d6d99-pjvhk\" (UID: \"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " pod="openstack/horizon-585c4d6d99-pjvhk" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.073424 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b5e4dce7-c9e7-4813-a957-1df502644792-etc-machine-id\") pod \"cinder-db-sync-f7d5b\" (UID: 
\"b5e4dce7-c9e7-4813-a957-1df502644792\") " pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.073444 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-config-data\") pod \"cinder-db-sync-f7d5b\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.073506 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/698c990e-e8b5-4c1b-8177-4e44874f4a44-horizon-secret-key\") pod \"horizon-585c4d6d99-pjvhk\" (UID: \"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " pod="openstack/horizon-585c4d6d99-pjvhk" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.073527 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/698c990e-e8b5-4c1b-8177-4e44874f4a44-scripts\") pod \"horizon-585c4d6d99-pjvhk\" (UID: \"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " pod="openstack/horizon-585c4d6d99-pjvhk" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.073579 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-db-sync-config-data\") pod \"cinder-db-sync-f7d5b\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.074204 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-scripts\") pod \"cinder-db-sync-f7d5b\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.074237 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/698c990e-e8b5-4c1b-8177-4e44874f4a44-config-data\") pod \"horizon-585c4d6d99-pjvhk\" (UID: \"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " pod="openstack/horizon-585c4d6d99-pjvhk" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.075004 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.075231 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.075399 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-d7f7p" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.075567 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b5e4dce7-c9e7-4813-a957-1df502644792-etc-machine-id\") pod \"cinder-db-sync-f7d5b\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.081206 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-db-sync-config-data\") pod \"cinder-db-sync-f7d5b\" (UID: 
\"b5e4dce7-c9e7-4813-a957-1df502644792\") " pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.081382 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-config-data\") pod \"cinder-db-sync-f7d5b\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.087898 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-scripts\") pod \"cinder-db-sync-f7d5b\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.090000 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-g8vbt"] Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.090752 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-combined-ca-bundle\") pod \"cinder-db-sync-f7d5b\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.177891 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/698c990e-e8b5-4c1b-8177-4e44874f4a44-config-data\") pod \"horizon-585c4d6d99-pjvhk\" (UID: \"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " pod="openstack/horizon-585c4d6d99-pjvhk" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.178547 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4021310b-c06b-44b3-9c95-7ca10552da10-config\") pod \"neutron-db-sync-g8vbt\" (UID: \"4021310b-c06b-44b3-9c95-7ca10552da10\") " pod="openstack/neutron-db-sync-g8vbt" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.178654 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4021310b-c06b-44b3-9c95-7ca10552da10-combined-ca-bundle\") pod \"neutron-db-sync-g8vbt\" (UID: \"4021310b-c06b-44b3-9c95-7ca10552da10\") " pod="openstack/neutron-db-sync-g8vbt" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.178744 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qftn\" (UniqueName: \"kubernetes.io/projected/4021310b-c06b-44b3-9c95-7ca10552da10-kube-api-access-2qftn\") pod \"neutron-db-sync-g8vbt\" (UID: \"4021310b-c06b-44b3-9c95-7ca10552da10\") " pod="openstack/neutron-db-sync-g8vbt" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.178895 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/698c990e-e8b5-4c1b-8177-4e44874f4a44-logs\") pod \"horizon-585c4d6d99-pjvhk\" (UID: \"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " pod="openstack/horizon-585c4d6d99-pjvhk" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.179073 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7hs9\" (UniqueName: \"kubernetes.io/projected/698c990e-e8b5-4c1b-8177-4e44874f4a44-kube-api-access-t7hs9\") pod \"horizon-585c4d6d99-pjvhk\" (UID: 
\"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " pod="openstack/horizon-585c4d6d99-pjvhk" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.179307 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/698c990e-e8b5-4c1b-8177-4e44874f4a44-horizon-secret-key\") pod \"horizon-585c4d6d99-pjvhk\" (UID: \"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " pod="openstack/horizon-585c4d6d99-pjvhk" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.179352 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/698c990e-e8b5-4c1b-8177-4e44874f4a44-scripts\") pod \"horizon-585c4d6d99-pjvhk\" (UID: \"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " pod="openstack/horizon-585c4d6d99-pjvhk" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.184341 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/698c990e-e8b5-4c1b-8177-4e44874f4a44-logs\") pod \"horizon-585c4d6d99-pjvhk\" (UID: \"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " pod="openstack/horizon-585c4d6d99-pjvhk" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.188250 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/698c990e-e8b5-4c1b-8177-4e44874f4a44-scripts\") pod \"horizon-585c4d6d99-pjvhk\" (UID: \"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " pod="openstack/horizon-585c4d6d99-pjvhk" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.193628 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/698c990e-e8b5-4c1b-8177-4e44874f4a44-horizon-secret-key\") pod \"horizon-585c4d6d99-pjvhk\" (UID: \"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " pod="openstack/horizon-585c4d6d99-pjvhk" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.198937 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/698c990e-e8b5-4c1b-8177-4e44874f4a44-config-data\") pod \"horizon-585c4d6d99-pjvhk\" (UID: \"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " pod="openstack/horizon-585c4d6d99-pjvhk" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.206783 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-mn58x"] Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.211572 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh5bv\" (UniqueName: \"kubernetes.io/projected/b5e4dce7-c9e7-4813-a957-1df502644792-kube-api-access-kh5bv\") pod \"cinder-db-sync-f7d5b\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.225608 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-mn58x" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.242719 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.243000 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tthks" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.242864 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7hs9\" (UniqueName: \"kubernetes.io/projected/698c990e-e8b5-4c1b-8177-4e44874f4a44-kube-api-access-t7hs9\") pod \"horizon-585c4d6d99-pjvhk\" (UID: \"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " pod="openstack/horizon-585c4d6d99-pjvhk" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.263138 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.270214 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-585c4d6d99-pjvhk" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.281749 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4021310b-c06b-44b3-9c95-7ca10552da10-config\") pod \"neutron-db-sync-g8vbt\" (UID: \"4021310b-c06b-44b3-9c95-7ca10552da10\") " pod="openstack/neutron-db-sync-g8vbt" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.281796 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4021310b-c06b-44b3-9c95-7ca10552da10-combined-ca-bundle\") pod \"neutron-db-sync-g8vbt\" (UID: \"4021310b-c06b-44b3-9c95-7ca10552da10\") " pod="openstack/neutron-db-sync-g8vbt" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.281837 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qftn\" (UniqueName: \"kubernetes.io/projected/4021310b-c06b-44b3-9c95-7ca10552da10-kube-api-access-2qftn\") pod \"neutron-db-sync-g8vbt\" (UID: \"4021310b-c06b-44b3-9c95-7ca10552da10\") " pod="openstack/neutron-db-sync-g8vbt" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.293359 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4021310b-c06b-44b3-9c95-7ca10552da10-combined-ca-bundle\") pod \"neutron-db-sync-g8vbt\" (UID: \"4021310b-c06b-44b3-9c95-7ca10552da10\") " pod="openstack/neutron-db-sync-g8vbt" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.300559 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4021310b-c06b-44b3-9c95-7ca10552da10-config\") pod \"neutron-db-sync-g8vbt\" (UID: \"4021310b-c06b-44b3-9c95-7ca10552da10\") " pod="openstack/neutron-db-sync-g8vbt" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.300937 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-rfnjt"] Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.334899 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mn58x"] Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.353745 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qftn\" (UniqueName: 
\"kubernetes.io/projected/4021310b-c06b-44b3-9c95-7ca10552da10-kube-api-access-2qftn\") pod \"neutron-db-sync-g8vbt\" (UID: \"4021310b-c06b-44b3-9c95-7ca10552da10\") " pod="openstack/neutron-db-sync-g8vbt" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.361802 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-xs9sp"] Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.363113 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xs9sp" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.371857 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.372478 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-wwz5j" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.372734 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.385563 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-g8vbt" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.386094 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1de4159c-2d90-4b3a-bcff-84f293a59c35-combined-ca-bundle\") pod \"barbican-db-sync-mn58x\" (UID: \"1de4159c-2d90-4b3a-bcff-84f293a59c35\") " pod="openstack/barbican-db-sync-mn58x" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.386151 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1de4159c-2d90-4b3a-bcff-84f293a59c35-db-sync-config-data\") pod \"barbican-db-sync-mn58x\" (UID: \"1de4159c-2d90-4b3a-bcff-84f293a59c35\") " pod="openstack/barbican-db-sync-mn58x" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.386192 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5dd2\" (UniqueName: \"kubernetes.io/projected/1de4159c-2d90-4b3a-bcff-84f293a59c35-kube-api-access-z5dd2\") pod \"barbican-db-sync-mn58x\" (UID: \"1de4159c-2d90-4b3a-bcff-84f293a59c35\") " pod="openstack/barbican-db-sync-mn58x" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.399288 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-bzcl4"] Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.401017 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.428521 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xs9sp"] Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.449238 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-bzcl4"] Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.480902 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.482402 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.485249 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-64949fcd59-km7qf"] Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.486796 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64949fcd59-km7qf" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.493377 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7rxpt" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.493723 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.493954 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.494416 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwnkp\" (UniqueName: \"kubernetes.io/projected/4f798138-a4f1-490f-8904-cfccbf0db793-kube-api-access-hwnkp\") pod \"placement-db-sync-xs9sp\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " pod="openstack/placement-db-sync-xs9sp" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.494542 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1de4159c-2d90-4b3a-bcff-84f293a59c35-combined-ca-bundle\") pod \"barbican-db-sync-mn58x\" (UID: \"1de4159c-2d90-4b3a-bcff-84f293a59c35\") " pod="openstack/barbican-db-sync-mn58x" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.494620 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-config\") pod \"dnsmasq-dns-57c957c4ff-bzcl4\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.494704 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f798138-a4f1-490f-8904-cfccbf0db793-scripts\") pod \"placement-db-sync-xs9sp\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " pod="openstack/placement-db-sync-xs9sp" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.494809 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1de4159c-2d90-4b3a-bcff-84f293a59c35-db-sync-config-data\") pod \"barbican-db-sync-mn58x\" (UID: \"1de4159c-2d90-4b3a-bcff-84f293a59c35\") " pod="openstack/barbican-db-sync-mn58x" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.494943 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-bzcl4\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.495030 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5dd2\" (UniqueName: 
\"kubernetes.io/projected/1de4159c-2d90-4b3a-bcff-84f293a59c35-kube-api-access-z5dd2\") pod \"barbican-db-sync-mn58x\" (UID: \"1de4159c-2d90-4b3a-bcff-84f293a59c35\") " pod="openstack/barbican-db-sync-mn58x" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.495119 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-bzcl4\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.495200 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f798138-a4f1-490f-8904-cfccbf0db793-combined-ca-bundle\") pod \"placement-db-sync-xs9sp\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " pod="openstack/placement-db-sync-xs9sp" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.495683 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f798138-a4f1-490f-8904-cfccbf0db793-logs\") pod \"placement-db-sync-xs9sp\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " pod="openstack/placement-db-sync-xs9sp" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.494733 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.501093 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.507476 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f798138-a4f1-490f-8904-cfccbf0db793-config-data\") pod \"placement-db-sync-xs9sp\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " pod="openstack/placement-db-sync-xs9sp" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.507547 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-bzcl4\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.507580 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-bzcl4\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.507646 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j8fk\" (UniqueName: \"kubernetes.io/projected/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-kube-api-access-4j8fk\") pod \"dnsmasq-dns-57c957c4ff-bzcl4\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.515438 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/1de4159c-2d90-4b3a-bcff-84f293a59c35-db-sync-config-data\") pod \"barbican-db-sync-mn58x\" (UID: \"1de4159c-2d90-4b3a-bcff-84f293a59c35\") " pod="openstack/barbican-db-sync-mn58x" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.519663 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1de4159c-2d90-4b3a-bcff-84f293a59c35-combined-ca-bundle\") pod \"barbican-db-sync-mn58x\" (UID: \"1de4159c-2d90-4b3a-bcff-84f293a59c35\") " pod="openstack/barbican-db-sync-mn58x" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.532734 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5dd2\" (UniqueName: \"kubernetes.io/projected/1de4159c-2d90-4b3a-bcff-84f293a59c35-kube-api-access-z5dd2\") pod \"barbican-db-sync-mn58x\" (UID: \"1de4159c-2d90-4b3a-bcff-84f293a59c35\") " pod="openstack/barbican-db-sync-mn58x" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.541717 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64949fcd59-km7qf"] Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.612780 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-config\") pod \"dnsmasq-dns-57c957c4ff-bzcl4\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.612865 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f798138-a4f1-490f-8904-cfccbf0db793-scripts\") pod \"placement-db-sync-xs9sp\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " pod="openstack/placement-db-sync-xs9sp" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.612928 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-config-data\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.612959 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-bzcl4\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.612997 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.613023 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-bzcl4\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.613045 4865 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f798138-a4f1-490f-8904-cfccbf0db793-combined-ca-bundle\") pod \"placement-db-sync-xs9sp\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " pod="openstack/placement-db-sync-xs9sp" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.613075 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-scripts\") pod \"horizon-64949fcd59-km7qf\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " pod="openstack/horizon-64949fcd59-km7qf" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.613105 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-scripts\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.613121 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-config-data\") pod \"horizon-64949fcd59-km7qf\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " pod="openstack/horizon-64949fcd59-km7qf" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.613137 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f798138-a4f1-490f-8904-cfccbf0db793-logs\") pod \"placement-db-sync-xs9sp\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " pod="openstack/placement-db-sync-xs9sp" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.613158 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.613187 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/affd9432-ce69-4ef7-8fee-6c8ac08aa659-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.613215 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f798138-a4f1-490f-8904-cfccbf0db793-config-data\") pod \"placement-db-sync-xs9sp\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " pod="openstack/placement-db-sync-xs9sp" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.613231 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-bzcl4\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.613251 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljwbq\" (UniqueName: 
\"kubernetes.io/projected/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-kube-api-access-ljwbq\") pod \"horizon-64949fcd59-km7qf\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " pod="openstack/horizon-64949fcd59-km7qf" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.613269 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-bzcl4\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.613287 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j8fk\" (UniqueName: \"kubernetes.io/projected/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-kube-api-access-4j8fk\") pod \"dnsmasq-dns-57c957c4ff-bzcl4\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.613312 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/affd9432-ce69-4ef7-8fee-6c8ac08aa659-logs\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.613330 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-logs\") pod \"horizon-64949fcd59-km7qf\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " pod="openstack/horizon-64949fcd59-km7qf" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.613359 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s86qq\" (UniqueName: \"kubernetes.io/projected/affd9432-ce69-4ef7-8fee-6c8ac08aa659-kube-api-access-s86qq\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.613394 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwnkp\" (UniqueName: \"kubernetes.io/projected/4f798138-a4f1-490f-8904-cfccbf0db793-kube-api-access-hwnkp\") pod \"placement-db-sync-xs9sp\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " pod="openstack/placement-db-sync-xs9sp" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.613419 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-horizon-secret-key\") pod \"horizon-64949fcd59-km7qf\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " pod="openstack/horizon-64949fcd59-km7qf" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.613448 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.615846 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-config\") pod \"dnsmasq-dns-57c957c4ff-bzcl4\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.617648 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-bzcl4\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.618321 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f798138-a4f1-490f-8904-cfccbf0db793-logs\") pod \"placement-db-sync-xs9sp\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " pod="openstack/placement-db-sync-xs9sp" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.618904 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-bzcl4\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.618953 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mn58x" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.619425 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-bzcl4\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.621026 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-bzcl4\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.622188 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.639290 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f798138-a4f1-490f-8904-cfccbf0db793-scripts\") pod \"placement-db-sync-xs9sp\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " pod="openstack/placement-db-sync-xs9sp" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.645972 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f798138-a4f1-490f-8904-cfccbf0db793-combined-ca-bundle\") pod \"placement-db-sync-xs9sp\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " pod="openstack/placement-db-sync-xs9sp" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.655381 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.662607 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f798138-a4f1-490f-8904-cfccbf0db793-config-data\") pod \"placement-db-sync-xs9sp\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " pod="openstack/placement-db-sync-xs9sp" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.665046 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.665157 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.667860 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwnkp\" (UniqueName: \"kubernetes.io/projected/4f798138-a4f1-490f-8904-cfccbf0db793-kube-api-access-hwnkp\") pod \"placement-db-sync-xs9sp\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " pod="openstack/placement-db-sync-xs9sp" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.698406 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xs9sp" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.702578 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.719740 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.719953 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.719977 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.720037 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-scripts\") pod \"horizon-64949fcd59-km7qf\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " pod="openstack/horizon-64949fcd59-km7qf" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.720067 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-scripts\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.720123 4865 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-config-data\") pod \"horizon-64949fcd59-km7qf\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " pod="openstack/horizon-64949fcd59-km7qf" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.720143 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.720207 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/affd9432-ce69-4ef7-8fee-6c8ac08aa659-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.720280 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljwbq\" (UniqueName: \"kubernetes.io/projected/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-kube-api-access-ljwbq\") pod \"horizon-64949fcd59-km7qf\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " pod="openstack/horizon-64949fcd59-km7qf" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.720353 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/affd9432-ce69-4ef7-8fee-6c8ac08aa659-logs\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.720382 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.720441 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-logs\") pod \"horizon-64949fcd59-km7qf\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " pod="openstack/horizon-64949fcd59-km7qf" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.720467 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.720515 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.720540 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s86qq\" 
(UniqueName: \"kubernetes.io/projected/affd9432-ce69-4ef7-8fee-6c8ac08aa659-kube-api-access-s86qq\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.720601 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69363ce-e52c-4aaf-afae-b45fc7a238c6-logs\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.720631 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-horizon-secret-key\") pod \"horizon-64949fcd59-km7qf\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " pod="openstack/horizon-64949fcd59-km7qf" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.720681 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9274t\" (UniqueName: \"kubernetes.io/projected/e69363ce-e52c-4aaf-afae-b45fc7a238c6-kube-api-access-9274t\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.720705 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.720788 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-config-data\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.720844 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e69363ce-e52c-4aaf-afae-b45fc7a238c6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.731819 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-scripts\") pod \"horizon-64949fcd59-km7qf\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " pod="openstack/horizon-64949fcd59-km7qf" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.735650 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-logs\") pod \"horizon-64949fcd59-km7qf\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " pod="openstack/horizon-64949fcd59-km7qf" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.758290 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/affd9432-ce69-4ef7-8fee-6c8ac08aa659-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.758566 4865 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.759188 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-config-data\") pod \"horizon-64949fcd59-km7qf\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " pod="openstack/horizon-64949fcd59-km7qf" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.759762 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/affd9432-ce69-4ef7-8fee-6c8ac08aa659-logs\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.762291 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.763605 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.790264 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j8fk\" (UniqueName: \"kubernetes.io/projected/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-kube-api-access-4j8fk\") pod \"dnsmasq-dns-57c957c4ff-bzcl4\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.793165 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-horizon-secret-key\") pod \"horizon-64949fcd59-km7qf\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " pod="openstack/horizon-64949fcd59-km7qf" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.793217 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-scripts\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.795731 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.797672 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-config-data\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.798992 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.811568 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.811938 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.813202 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.821533 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s86qq\" (UniqueName: \"kubernetes.io/projected/affd9432-ce69-4ef7-8fee-6c8ac08aa659-kube-api-access-s86qq\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.822492 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9274t\" (UniqueName: \"kubernetes.io/projected/e69363ce-e52c-4aaf-afae-b45fc7a238c6-kube-api-access-9274t\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.822554 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e69363ce-e52c-4aaf-afae-b45fc7a238c6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.822580 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.822600 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.822680 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.822702 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " 
pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.822725 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.822753 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69363ce-e52c-4aaf-afae-b45fc7a238c6-logs\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.823128 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e69363ce-e52c-4aaf-afae-b45fc7a238c6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.823138 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69363ce-e52c-4aaf-afae-b45fc7a238c6-logs\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.823263 4865 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.838753 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.842686 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.843264 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.849345 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.862389 4865 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.881373 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.899205 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljwbq\" (UniqueName: \"kubernetes.io/projected/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-kube-api-access-ljwbq\") pod \"horizon-64949fcd59-km7qf\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " pod="openstack/horizon-64949fcd59-km7qf" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.902466 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9274t\" (UniqueName: \"kubernetes.io/projected/e69363ce-e52c-4aaf-afae-b45fc7a238c6-kube-api-access-9274t\") pod \"glance-default-internal-api-0\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.902590 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64949fcd59-km7qf" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.933891 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.933935 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-scripts\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.933970 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-run-httpd\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.933990 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.934009 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-config-data\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.934039 4865 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn2st\" (UniqueName: \"kubernetes.io/projected/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-kube-api-access-sn2st\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:42 crc kubenswrapper[4865]: I1205 06:11:42.934052 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-log-httpd\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.016348 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.036793 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.036861 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-scripts\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.036899 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-run-httpd\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.036921 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.036945 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-config-data\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.036982 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn2st\" (UniqueName: \"kubernetes.io/projected/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-kube-api-access-sn2st\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.037001 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-log-httpd\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.037690 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-log-httpd\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.043305 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-run-httpd\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.050750 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-scripts\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.132355 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.133433 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-config-data\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.133792 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.153626 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.159623 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mczg5"] Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.176762 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.224620 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn2st\" (UniqueName: \"kubernetes.io/projected/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-kube-api-access-sn2st\") pod \"ceilometer-0\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " pod="openstack/ceilometer-0" Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.234751 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mczg5" event={"ID":"774b2338-1b59-4df3-ac6f-939724c29231","Type":"ContainerStarted","Data":"aa0ad94227755e8d5a290720f6be9a7528f2c06249d6e78a16548919ee454de4"} Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.369277 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.470184 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-f7d5b"] Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.514894 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-rfnjt"] Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.566578 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-g8vbt"] Dec 05 06:11:43 crc kubenswrapper[4865]: W1205 06:11:43.575330 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5e4dce7_c9e7_4813_a957_1df502644792.slice/crio-bf41e3f47c78f15d8a83a70ed918b306484e815d568941424cec5a9cb9093ae9 WatchSource:0}: Error finding container bf41e3f47c78f15d8a83a70ed918b306484e815d568941424cec5a9cb9093ae9: Status 404 returned error can't find the container with id bf41e3f47c78f15d8a83a70ed918b306484e815d568941424cec5a9cb9093ae9 Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.616008 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-585c4d6d99-pjvhk"] Dec 05 06:11:43 crc kubenswrapper[4865]: I1205 06:11:43.861635 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xs9sp"] Dec 05 06:11:43 crc kubenswrapper[4865]: W1205 06:11:43.878353 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f798138_a4f1_490f_8904_cfccbf0db793.slice/crio-d71e34d52561bb90448f19a0adf2a8fe5ca14d211a0fb73d9e7ebac5cc05cf9b WatchSource:0}: Error finding container d71e34d52561bb90448f19a0adf2a8fe5ca14d211a0fb73d9e7ebac5cc05cf9b: Status 404 returned error can't find the container with id d71e34d52561bb90448f19a0adf2a8fe5ca14d211a0fb73d9e7ebac5cc05cf9b Dec 05 06:11:44 crc kubenswrapper[4865]: I1205 06:11:44.016489 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mn58x"] Dec 05 06:11:44 crc kubenswrapper[4865]: I1205 06:11:44.158575 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64949fcd59-km7qf"] Dec 05 06:11:44 crc kubenswrapper[4865]: I1205 06:11:44.251756 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f7d5b" event={"ID":"b5e4dce7-c9e7-4813-a957-1df502644792","Type":"ContainerStarted","Data":"bf41e3f47c78f15d8a83a70ed918b306484e815d568941424cec5a9cb9093ae9"} Dec 05 06:11:44 crc kubenswrapper[4865]: I1205 06:11:44.264532 4865 generic.go:334] "Generic (PLEG): container finished" podID="fb46cc5d-4c21-482c-85f5-76e1726d0d99" containerID="343046cd9ecb87be0b3a5086d11e16799aeab170bfc0223e0434a4304bfb219d" exitCode=0 Dec 05 06:11:44 crc kubenswrapper[4865]: I1205 06:11:44.265854 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" event={"ID":"fb46cc5d-4c21-482c-85f5-76e1726d0d99","Type":"ContainerDied","Data":"343046cd9ecb87be0b3a5086d11e16799aeab170bfc0223e0434a4304bfb219d"} Dec 05 06:11:44 crc kubenswrapper[4865]: I1205 06:11:44.265905 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" event={"ID":"fb46cc5d-4c21-482c-85f5-76e1726d0d99","Type":"ContainerStarted","Data":"c8b55244b2f0fb8ce93d591084c06a86800025aa63417c6781047d3f3f896a92"} Dec 05 06:11:44 crc kubenswrapper[4865]: I1205 06:11:44.275243 4865 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/horizon-585c4d6d99-pjvhk" event={"ID":"698c990e-e8b5-4c1b-8177-4e44874f4a44","Type":"ContainerStarted","Data":"d69f591385a79cea2dff5928f85ae63668916d9c66f754f753f5abec1e359335"} Dec 05 06:11:44 crc kubenswrapper[4865]: I1205 06:11:44.282097 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64949fcd59-km7qf" event={"ID":"f25f8ba1-4e54-4919-9fdf-b7d7abf18572","Type":"ContainerStarted","Data":"286a1ad523bb32f7076815da56602668e9d663b5de7f8c90a5778a341f44bcf0"} Dec 05 06:11:44 crc kubenswrapper[4865]: I1205 06:11:44.298253 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mczg5" event={"ID":"774b2338-1b59-4df3-ac6f-939724c29231","Type":"ContainerStarted","Data":"7815231a3951ecc27a948b95402d899cce2a6aa10c5addd5981f6d55a4c1efe9"} Dec 05 06:11:44 crc kubenswrapper[4865]: I1205 06:11:44.328003 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xs9sp" event={"ID":"4f798138-a4f1-490f-8904-cfccbf0db793","Type":"ContainerStarted","Data":"d71e34d52561bb90448f19a0adf2a8fe5ca14d211a0fb73d9e7ebac5cc05cf9b"} Dec 05 06:11:44 crc kubenswrapper[4865]: I1205 06:11:44.342666 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-mczg5" podStartSLOduration=3.342625946 podStartE2EDuration="3.342625946s" podCreationTimestamp="2025-12-05 06:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:11:44.327790017 +0000 UTC m=+1123.607801239" watchObservedRunningTime="2025-12-05 06:11:44.342625946 +0000 UTC m=+1123.622637168" Dec 05 06:11:44 crc kubenswrapper[4865]: I1205 06:11:44.355422 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g8vbt" event={"ID":"4021310b-c06b-44b3-9c95-7ca10552da10","Type":"ContainerStarted","Data":"56ab844b39f7f229e643ac1e2e555f59ffc130412970cd5a6d57b22a5128c88c"} Dec 05 06:11:44 crc kubenswrapper[4865]: I1205 06:11:44.355465 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g8vbt" event={"ID":"4021310b-c06b-44b3-9c95-7ca10552da10","Type":"ContainerStarted","Data":"46df351c6eda26cd6c8cb415679bc2a9cd577421125c27052d20d4024f3ede35"} Dec 05 06:11:44 crc kubenswrapper[4865]: I1205 06:11:44.376321 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mn58x" event={"ID":"1de4159c-2d90-4b3a-bcff-84f293a59c35","Type":"ContainerStarted","Data":"1c28652e943ec547456885c1223bec0b8320d07fd74ceb9c535f6417e5ff62d1"} Dec 05 06:11:44 crc kubenswrapper[4865]: I1205 06:11:44.402360 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-g8vbt" podStartSLOduration=3.402338296 podStartE2EDuration="3.402338296s" podCreationTimestamp="2025-12-05 06:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:11:44.387927298 +0000 UTC m=+1123.667938520" watchObservedRunningTime="2025-12-05 06:11:44.402338296 +0000 UTC m=+1123.682349518" Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:44.446988 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:44.567371 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 06:11:45 crc kubenswrapper[4865]: 
I1205 06:11:44.570532 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:44.578423 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-bzcl4"] Dec 05 06:11:45 crc kubenswrapper[4865]: W1205 06:11:44.653285 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaffd9432_ce69_4ef7_8fee_6c8ac08aa659.slice/crio-0f9fec32a8497b6031595cbb002f3ff9934babf86c77a0cf50e2ed334fa7a8bd WatchSource:0}: Error finding container 0f9fec32a8497b6031595cbb002f3ff9934babf86c77a0cf50e2ed334fa7a8bd: Status 404 returned error can't find the container with id 0f9fec32a8497b6031595cbb002f3ff9934babf86c77a0cf50e2ed334fa7a8bd Dec 05 06:11:45 crc kubenswrapper[4865]: W1205 06:11:44.682608 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c79c96f_7385_43dc_8ceb_ac1bf6de7a66.slice/crio-3d6770735365fde5612578a66e65f2704b249d8593b8fc976465edc859faa9b6 WatchSource:0}: Error finding container 3d6770735365fde5612578a66e65f2704b249d8593b8fc976465edc859faa9b6: Status 404 returned error can't find the container with id 3d6770735365fde5612578a66e65f2704b249d8593b8fc976465edc859faa9b6 Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:44.947624 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.049876 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-585c4d6d99-pjvhk"] Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.072839 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c7f7c44b9-rm82k"] Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.087957 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c7f7c44b9-rm82k"] Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.088099 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c7f7c44b9-rm82k" Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.117169 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.219024 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-config-data\") pod \"horizon-7c7f7c44b9-rm82k\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " pod="openstack/horizon-7c7f7c44b9-rm82k" Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.219069 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-266hb\" (UniqueName: \"kubernetes.io/projected/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-kube-api-access-266hb\") pod \"horizon-7c7f7c44b9-rm82k\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " pod="openstack/horizon-7c7f7c44b9-rm82k" Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.219108 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-scripts\") pod \"horizon-7c7f7c44b9-rm82k\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " pod="openstack/horizon-7c7f7c44b9-rm82k" Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.219149 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-logs\") pod \"horizon-7c7f7c44b9-rm82k\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " pod="openstack/horizon-7c7f7c44b9-rm82k" Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.219189 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-horizon-secret-key\") pod \"horizon-7c7f7c44b9-rm82k\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " pod="openstack/horizon-7c7f7c44b9-rm82k" Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.325878 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-config-data\") pod \"horizon-7c7f7c44b9-rm82k\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " pod="openstack/horizon-7c7f7c44b9-rm82k" Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.326203 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-266hb\" (UniqueName: \"kubernetes.io/projected/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-kube-api-access-266hb\") pod \"horizon-7c7f7c44b9-rm82k\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " pod="openstack/horizon-7c7f7c44b9-rm82k" Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.326249 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-scripts\") pod \"horizon-7c7f7c44b9-rm82k\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " pod="openstack/horizon-7c7f7c44b9-rm82k" Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.326308 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-logs\") pod \"horizon-7c7f7c44b9-rm82k\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " pod="openstack/horizon-7c7f7c44b9-rm82k" Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.326369 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-horizon-secret-key\") pod \"horizon-7c7f7c44b9-rm82k\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " pod="openstack/horizon-7c7f7c44b9-rm82k" Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.328061 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-config-data\") pod \"horizon-7c7f7c44b9-rm82k\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " pod="openstack/horizon-7c7f7c44b9-rm82k" Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.328474 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-scripts\") pod \"horizon-7c7f7c44b9-rm82k\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " pod="openstack/horizon-7c7f7c44b9-rm82k" Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.328704 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-logs\") pod \"horizon-7c7f7c44b9-rm82k\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " pod="openstack/horizon-7c7f7c44b9-rm82k" Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.336979 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-horizon-secret-key\") pod \"horizon-7c7f7c44b9-rm82k\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " pod="openstack/horizon-7c7f7c44b9-rm82k" Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.377347 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-266hb\" (UniqueName: \"kubernetes.io/projected/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-kube-api-access-266hb\") pod \"horizon-7c7f7c44b9-rm82k\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " pod="openstack/horizon-7c7f7c44b9-rm82k" Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.396495 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66","Type":"ContainerStarted","Data":"3d6770735365fde5612578a66e65f2704b249d8593b8fc976465edc859faa9b6"} Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.400588 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"affd9432-ce69-4ef7-8fee-6c8ac08aa659","Type":"ContainerStarted","Data":"0f9fec32a8497b6031595cbb002f3ff9934babf86c77a0cf50e2ed334fa7a8bd"} Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.410665 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" event={"ID":"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828","Type":"ContainerStarted","Data":"9a3e6a681a3208685c7ecfbd724a67c22142792e36504c325612b65585149a8b"} Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.415207 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"e69363ce-e52c-4aaf-afae-b45fc7a238c6","Type":"ContainerStarted","Data":"8cf421aa62f3067b16775ee30ab204ddbfb37b61001e91e017ba62053c2e946c"} Dec 05 06:11:45 crc kubenswrapper[4865]: I1205 06:11:45.488944 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c7f7c44b9-rm82k" Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.000655 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.118626 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.142903 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-ovsdbserver-sb\") pod \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.142960 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-config\") pod \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.143043 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf82f\" (UniqueName: \"kubernetes.io/projected/fb46cc5d-4c21-482c-85f5-76e1726d0d99-kube-api-access-kf82f\") pod \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.143082 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-dns-svc\") pod \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.143151 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-dns-swift-storage-0\") pod \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.143179 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-ovsdbserver-nb\") pod \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\" (UID: \"fb46cc5d-4c21-482c-85f5-76e1726d0d99\") " Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.176176 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb46cc5d-4c21-482c-85f5-76e1726d0d99-kube-api-access-kf82f" (OuterVolumeSpecName: "kube-api-access-kf82f") pod "fb46cc5d-4c21-482c-85f5-76e1726d0d99" (UID: "fb46cc5d-4c21-482c-85f5-76e1726d0d99"). InnerVolumeSpecName "kube-api-access-kf82f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.179629 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fb46cc5d-4c21-482c-85f5-76e1726d0d99" (UID: "fb46cc5d-4c21-482c-85f5-76e1726d0d99"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.212401 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fb46cc5d-4c21-482c-85f5-76e1726d0d99" (UID: "fb46cc5d-4c21-482c-85f5-76e1726d0d99"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.244646 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf82f\" (UniqueName: \"kubernetes.io/projected/fb46cc5d-4c21-482c-85f5-76e1726d0d99-kube-api-access-kf82f\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.244679 4865 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.244689 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.294422 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fb46cc5d-4c21-482c-85f5-76e1726d0d99" (UID: "fb46cc5d-4c21-482c-85f5-76e1726d0d99"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.318715 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-config" (OuterVolumeSpecName: "config") pod "fb46cc5d-4c21-482c-85f5-76e1726d0d99" (UID: "fb46cc5d-4c21-482c-85f5-76e1726d0d99"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.328997 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fb46cc5d-4c21-482c-85f5-76e1726d0d99" (UID: "fb46cc5d-4c21-482c-85f5-76e1726d0d99"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.349762 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.349802 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.349812 4865 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb46cc5d-4c21-482c-85f5-76e1726d0d99-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.415662 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c7f7c44b9-rm82k"] Dec 05 06:11:46 crc kubenswrapper[4865]: W1205 06:11:46.447641 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb26cf35b_7e4e_4bb7_93a7_b16cc18ef9e4.slice/crio-d02361ef8570859292eea00a127d679bfbbdf47435fa1807800b206ec1055326 WatchSource:0}: Error finding container d02361ef8570859292eea00a127d679bfbbdf47435fa1807800b206ec1055326: Status 404 returned error can't find the container with id d02361ef8570859292eea00a127d679bfbbdf47435fa1807800b206ec1055326 Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.448167 4865 generic.go:334] "Generic (PLEG): container finished" podID="b6bede1f-8e18-4ccd-8f12-8bf7df2e7828" containerID="4b22ac2b2285bfbb5b6e3827affd91b577617a6e333e8f46de9ae23685a94e0a" exitCode=0 Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.448230 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" event={"ID":"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828","Type":"ContainerDied","Data":"4b22ac2b2285bfbb5b6e3827affd91b577617a6e333e8f46de9ae23685a94e0a"} Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.475011 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" event={"ID":"fb46cc5d-4c21-482c-85f5-76e1726d0d99","Type":"ContainerDied","Data":"c8b55244b2f0fb8ce93d591084c06a86800025aa63417c6781047d3f3f896a92"} Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.475072 4865 scope.go:117] "RemoveContainer" containerID="343046cd9ecb87be0b3a5086d11e16799aeab170bfc0223e0434a4304bfb219d" Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.475227 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-rfnjt" Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.560462 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-rfnjt"] Dec 05 06:11:46 crc kubenswrapper[4865]: I1205 06:11:46.587023 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-rfnjt"] Dec 05 06:11:47 crc kubenswrapper[4865]: I1205 06:11:47.034525 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb46cc5d-4c21-482c-85f5-76e1726d0d99" path="/var/lib/kubelet/pods/fb46cc5d-4c21-482c-85f5-76e1726d0d99/volumes" Dec 05 06:11:47 crc kubenswrapper[4865]: I1205 06:11:47.501325 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c7f7c44b9-rm82k" event={"ID":"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4","Type":"ContainerStarted","Data":"d02361ef8570859292eea00a127d679bfbbdf47435fa1807800b206ec1055326"} Dec 05 06:11:47 crc kubenswrapper[4865]: I1205 06:11:47.505316 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"affd9432-ce69-4ef7-8fee-6c8ac08aa659","Type":"ContainerStarted","Data":"0e5c6174279db9f2c2158509cd1abfdeb6bbf2e625e85d118135e06cfdadfcb0"} Dec 05 06:11:47 crc kubenswrapper[4865]: I1205 06:11:47.516523 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" event={"ID":"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828","Type":"ContainerStarted","Data":"f850146e71caab5731ce1a161a22dbecb87ccd602f550f39d2da944128969cff"} Dec 05 06:11:47 crc kubenswrapper[4865]: I1205 06:11:47.516848 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:47 crc kubenswrapper[4865]: I1205 06:11:47.524623 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e69363ce-e52c-4aaf-afae-b45fc7a238c6","Type":"ContainerStarted","Data":"ce08ed414536da4899f690a5bbc9a15e6b7b380c57c7888b9888d0743aaa184d"} Dec 05 06:11:47 crc kubenswrapper[4865]: I1205 06:11:47.539524 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" podStartSLOduration=5.5395044030000005 podStartE2EDuration="5.539504403s" podCreationTimestamp="2025-12-05 06:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:11:47.538951918 +0000 UTC m=+1126.818963140" watchObservedRunningTime="2025-12-05 06:11:47.539504403 +0000 UTC m=+1126.819515625" Dec 05 06:11:48 crc kubenswrapper[4865]: I1205 06:11:48.626932 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e69363ce-e52c-4aaf-afae-b45fc7a238c6","Type":"ContainerStarted","Data":"1c532670e5385ca841a39773c7282ecf5c12ea065147571421fdf607aafb856b"} Dec 05 06:11:48 crc kubenswrapper[4865]: I1205 06:11:48.627171 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e69363ce-e52c-4aaf-afae-b45fc7a238c6" containerName="glance-log" containerID="cri-o://ce08ed414536da4899f690a5bbc9a15e6b7b380c57c7888b9888d0743aaa184d" gracePeriod=30 Dec 05 06:11:48 crc kubenswrapper[4865]: I1205 06:11:48.627242 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="e69363ce-e52c-4aaf-afae-b45fc7a238c6" containerName="glance-httpd" containerID="cri-o://1c532670e5385ca841a39773c7282ecf5c12ea065147571421fdf607aafb856b" gracePeriod=30 Dec 05 06:11:48 crc kubenswrapper[4865]: I1205 06:11:48.659064 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.659042607 podStartE2EDuration="6.659042607s" podCreationTimestamp="2025-12-05 06:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:11:48.653527491 +0000 UTC m=+1127.933538713" watchObservedRunningTime="2025-12-05 06:11:48.659042607 +0000 UTC m=+1127.939053829" Dec 05 06:11:49 crc kubenswrapper[4865]: I1205 06:11:49.642218 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"affd9432-ce69-4ef7-8fee-6c8ac08aa659","Type":"ContainerStarted","Data":"10bdf87b31554ce9a676365279211426e289ffda01bd8b3e84d429e1c59be80d"} Dec 05 06:11:49 crc kubenswrapper[4865]: I1205 06:11:49.642913 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="affd9432-ce69-4ef7-8fee-6c8ac08aa659" containerName="glance-log" containerID="cri-o://0e5c6174279db9f2c2158509cd1abfdeb6bbf2e625e85d118135e06cfdadfcb0" gracePeriod=30 Dec 05 06:11:49 crc kubenswrapper[4865]: I1205 06:11:49.644352 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="affd9432-ce69-4ef7-8fee-6c8ac08aa659" containerName="glance-httpd" containerID="cri-o://10bdf87b31554ce9a676365279211426e289ffda01bd8b3e84d429e1c59be80d" gracePeriod=30 Dec 05 06:11:49 crc kubenswrapper[4865]: I1205 06:11:49.673427 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.673409385 podStartE2EDuration="7.673409385s" podCreationTimestamp="2025-12-05 06:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:11:49.670884364 +0000 UTC m=+1128.950895596" watchObservedRunningTime="2025-12-05 06:11:49.673409385 +0000 UTC m=+1128.953420607" Dec 05 06:11:49 crc kubenswrapper[4865]: I1205 06:11:49.694476 4865 generic.go:334] "Generic (PLEG): container finished" podID="e69363ce-e52c-4aaf-afae-b45fc7a238c6" containerID="1c532670e5385ca841a39773c7282ecf5c12ea065147571421fdf607aafb856b" exitCode=143 Dec 05 06:11:49 crc kubenswrapper[4865]: I1205 06:11:49.694514 4865 generic.go:334] "Generic (PLEG): container finished" podID="e69363ce-e52c-4aaf-afae-b45fc7a238c6" containerID="ce08ed414536da4899f690a5bbc9a15e6b7b380c57c7888b9888d0743aaa184d" exitCode=143 Dec 05 06:11:49 crc kubenswrapper[4865]: I1205 06:11:49.694536 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e69363ce-e52c-4aaf-afae-b45fc7a238c6","Type":"ContainerDied","Data":"1c532670e5385ca841a39773c7282ecf5c12ea065147571421fdf607aafb856b"} Dec 05 06:11:49 crc kubenswrapper[4865]: I1205 06:11:49.694567 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e69363ce-e52c-4aaf-afae-b45fc7a238c6","Type":"ContainerDied","Data":"ce08ed414536da4899f690a5bbc9a15e6b7b380c57c7888b9888d0743aaa184d"} Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 
06:11:50.073021 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.181208 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e69363ce-e52c-4aaf-afae-b45fc7a238c6-httpd-run\") pod \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.181261 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-scripts\") pod \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.181315 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-internal-tls-certs\") pod \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.181347 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-combined-ca-bundle\") pod \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.181431 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.181450 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9274t\" (UniqueName: \"kubernetes.io/projected/e69363ce-e52c-4aaf-afae-b45fc7a238c6-kube-api-access-9274t\") pod \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.181507 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-config-data\") pod \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.181564 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69363ce-e52c-4aaf-afae-b45fc7a238c6-logs\") pod \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\" (UID: \"e69363ce-e52c-4aaf-afae-b45fc7a238c6\") " Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.182301 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e69363ce-e52c-4aaf-afae-b45fc7a238c6-logs" (OuterVolumeSpecName: "logs") pod "e69363ce-e52c-4aaf-afae-b45fc7a238c6" (UID: "e69363ce-e52c-4aaf-afae-b45fc7a238c6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.182442 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e69363ce-e52c-4aaf-afae-b45fc7a238c6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e69363ce-e52c-4aaf-afae-b45fc7a238c6" (UID: "e69363ce-e52c-4aaf-afae-b45fc7a238c6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.189776 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e69363ce-e52c-4aaf-afae-b45fc7a238c6-kube-api-access-9274t" (OuterVolumeSpecName: "kube-api-access-9274t") pod "e69363ce-e52c-4aaf-afae-b45fc7a238c6" (UID: "e69363ce-e52c-4aaf-afae-b45fc7a238c6"). InnerVolumeSpecName "kube-api-access-9274t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.213497 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "e69363ce-e52c-4aaf-afae-b45fc7a238c6" (UID: "e69363ce-e52c-4aaf-afae-b45fc7a238c6"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.213598 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-scripts" (OuterVolumeSpecName: "scripts") pod "e69363ce-e52c-4aaf-afae-b45fc7a238c6" (UID: "e69363ce-e52c-4aaf-afae-b45fc7a238c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.241383 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e69363ce-e52c-4aaf-afae-b45fc7a238c6" (UID: "e69363ce-e52c-4aaf-afae-b45fc7a238c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.283764 4865 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69363ce-e52c-4aaf-afae-b45fc7a238c6-logs\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.283796 4865 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e69363ce-e52c-4aaf-afae-b45fc7a238c6-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.283806 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.283813 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.283851 4865 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.283859 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9274t\" (UniqueName: \"kubernetes.io/projected/e69363ce-e52c-4aaf-afae-b45fc7a238c6-kube-api-access-9274t\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.292766 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-config-data" (OuterVolumeSpecName: "config-data") pod "e69363ce-e52c-4aaf-afae-b45fc7a238c6" (UID: "e69363ce-e52c-4aaf-afae-b45fc7a238c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.319967 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e69363ce-e52c-4aaf-afae-b45fc7a238c6" (UID: "e69363ce-e52c-4aaf-afae-b45fc7a238c6"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.321027 4865 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.385584 4865 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.385612 4865 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.385622 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69363ce-e52c-4aaf-afae-b45fc7a238c6-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.705652 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64949fcd59-km7qf"] Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.758345 4865 generic.go:334] "Generic (PLEG): container finished" podID="774b2338-1b59-4df3-ac6f-939724c29231" containerID="7815231a3951ecc27a948b95402d899cce2a6aa10c5addd5981f6d55a4c1efe9" exitCode=0 Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.758458 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mczg5" event={"ID":"774b2338-1b59-4df3-ac6f-939724c29231","Type":"ContainerDied","Data":"7815231a3951ecc27a948b95402d899cce2a6aa10c5addd5981f6d55a4c1efe9"} Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.775320 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-bd68dd9b8-z62zt"] Dec 05 06:11:50 crc kubenswrapper[4865]: E1205 06:11:50.775707 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e69363ce-e52c-4aaf-afae-b45fc7a238c6" containerName="glance-log" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.775720 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="e69363ce-e52c-4aaf-afae-b45fc7a238c6" containerName="glance-log" Dec 05 06:11:50 crc kubenswrapper[4865]: E1205 06:11:50.775745 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb46cc5d-4c21-482c-85f5-76e1726d0d99" containerName="init" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.775752 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb46cc5d-4c21-482c-85f5-76e1726d0d99" containerName="init" Dec 05 06:11:50 crc kubenswrapper[4865]: E1205 06:11:50.775763 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e69363ce-e52c-4aaf-afae-b45fc7a238c6" containerName="glance-httpd" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.775769 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="e69363ce-e52c-4aaf-afae-b45fc7a238c6" containerName="glance-httpd" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.775952 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb46cc5d-4c21-482c-85f5-76e1726d0d99" containerName="init" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.775966 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="e69363ce-e52c-4aaf-afae-b45fc7a238c6" containerName="glance-httpd" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.775979 4865 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e69363ce-e52c-4aaf-afae-b45fc7a238c6" containerName="glance-log" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.776981 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.783792 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.790503 4865 generic.go:334] "Generic (PLEG): container finished" podID="affd9432-ce69-4ef7-8fee-6c8ac08aa659" containerID="10bdf87b31554ce9a676365279211426e289ffda01bd8b3e84d429e1c59be80d" exitCode=0 Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.790534 4865 generic.go:334] "Generic (PLEG): container finished" podID="affd9432-ce69-4ef7-8fee-6c8ac08aa659" containerID="0e5c6174279db9f2c2158509cd1abfdeb6bbf2e625e85d118135e06cfdadfcb0" exitCode=143 Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.790579 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"affd9432-ce69-4ef7-8fee-6c8ac08aa659","Type":"ContainerDied","Data":"10bdf87b31554ce9a676365279211426e289ffda01bd8b3e84d429e1c59be80d"} Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.790609 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"affd9432-ce69-4ef7-8fee-6c8ac08aa659","Type":"ContainerDied","Data":"0e5c6174279db9f2c2158509cd1abfdeb6bbf2e625e85d118135e06cfdadfcb0"} Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.840856 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bd68dd9b8-z62zt"] Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.844156 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e69363ce-e52c-4aaf-afae-b45fc7a238c6","Type":"ContainerDied","Data":"8cf421aa62f3067b16775ee30ab204ddbfb37b61001e91e017ba62053c2e946c"} Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.844216 4865 scope.go:117] "RemoveContainer" containerID="1c532670e5385ca841a39773c7282ecf5c12ea065147571421fdf607aafb856b" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.844411 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.902432 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-combined-ca-bundle\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.902696 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-horizon-secret-key\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.902777 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-config-data\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.902868 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-scripts\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.902987 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-logs\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.903046 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-horizon-tls-certs\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.903213 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgw5w\" (UniqueName: \"kubernetes.io/projected/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-kube-api-access-vgw5w\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.906949 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c7f7c44b9-rm82k"] Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.946928 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-78c59b79fd-5jlv4"] Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.969419 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:50 crc kubenswrapper[4865]: I1205 06:11:50.999266 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.017620 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-logs\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.017693 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-horizon-tls-certs\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.018484 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-config-data\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.019948 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgw5w\" (UniqueName: \"kubernetes.io/projected/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-kube-api-access-vgw5w\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.020033 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-combined-ca-bundle\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.023735 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-horizon-secret-key\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.023791 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4wdk\" (UniqueName: \"kubernetes.io/projected/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-kube-api-access-d4wdk\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.024078 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-config-data\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.024107 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-scripts\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.024161 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-scripts\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.024199 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-logs\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.024255 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-horizon-tls-certs\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.024323 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-combined-ca-bundle\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.024370 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-horizon-secret-key\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.035430 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-logs\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.038695 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-config-data\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.044677 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-scripts\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.066529 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-horizon-tls-certs\") pod \"horizon-bd68dd9b8-z62zt\" (UID: 
\"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.078776 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgw5w\" (UniqueName: \"kubernetes.io/projected/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-kube-api-access-vgw5w\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.094540 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-horizon-secret-key\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.097208 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-combined-ca-bundle\") pod \"horizon-bd68dd9b8-z62zt\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.116875 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.116917 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-78c59b79fd-5jlv4"] Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.116933 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.119622 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.123567 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.125285 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.125330 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.126700 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4wdk\" (UniqueName: \"kubernetes.io/projected/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-kube-api-access-d4wdk\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.126737 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-scripts\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.126770 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-logs\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.126798 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-horizon-tls-certs\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.126842 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-combined-ca-bundle\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.126869 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-horizon-secret-key\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.126892 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-config-data\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.128273 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-logs\") pod \"horizon-78c59b79fd-5jlv4\" (UID: 
\"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.132135 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-scripts\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.136004 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-combined-ca-bundle\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.137324 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-horizon-tls-certs\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.137893 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-horizon-secret-key\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.143702 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-config-data\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.154606 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.184891 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.227942 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4wdk\" (UniqueName: \"kubernetes.io/projected/0b2dbfc6-6978-4613-a307-d4d4b4b88bc9-kube-api-access-d4wdk\") pod \"horizon-78c59b79fd-5jlv4\" (UID: \"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9\") " pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: E1205 06:11:51.280483 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-fzdlw logs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="11a2ab48-a0e8-4017-b685-a00658242d6f" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.297498 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.330653 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.330699 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/11a2ab48-a0e8-4017-b685-a00658242d6f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.330724 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.330748 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.330780 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.330850 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11a2ab48-a0e8-4017-b685-a00658242d6f-logs\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.330870 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.330907 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzdlw\" (UniqueName: \"kubernetes.io/projected/11a2ab48-a0e8-4017-b685-a00658242d6f-kube-api-access-fzdlw\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.432331 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.432429 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11a2ab48-a0e8-4017-b685-a00658242d6f-logs\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.432457 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.432498 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzdlw\" (UniqueName: \"kubernetes.io/projected/11a2ab48-a0e8-4017-b685-a00658242d6f-kube-api-access-fzdlw\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.432554 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.432573 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/11a2ab48-a0e8-4017-b685-a00658242d6f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.432595 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.432612 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.436338 4865 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.437023 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11a2ab48-a0e8-4017-b685-a00658242d6f-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.437262 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/11a2ab48-a0e8-4017-b685-a00658242d6f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.438906 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.463502 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.463867 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.463888 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.467336 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzdlw\" (UniqueName: \"kubernetes.io/projected/11a2ab48-a0e8-4017-b685-a00658242d6f-kube-api-access-fzdlw\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.486857 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.872632 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 06:11:51 crc kubenswrapper[4865]: I1205 06:11:51.900222 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.045682 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"11a2ab48-a0e8-4017-b685-a00658242d6f\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.045772 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-scripts\") pod \"11a2ab48-a0e8-4017-b685-a00658242d6f\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.045866 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-combined-ca-bundle\") pod \"11a2ab48-a0e8-4017-b685-a00658242d6f\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.045926 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11a2ab48-a0e8-4017-b685-a00658242d6f-logs\") pod \"11a2ab48-a0e8-4017-b685-a00658242d6f\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.045958 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/11a2ab48-a0e8-4017-b685-a00658242d6f-httpd-run\") pod \"11a2ab48-a0e8-4017-b685-a00658242d6f\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.045978 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzdlw\" (UniqueName: \"kubernetes.io/projected/11a2ab48-a0e8-4017-b685-a00658242d6f-kube-api-access-fzdlw\") pod \"11a2ab48-a0e8-4017-b685-a00658242d6f\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.046028 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-config-data\") pod \"11a2ab48-a0e8-4017-b685-a00658242d6f\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.046069 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-internal-tls-certs\") pod \"11a2ab48-a0e8-4017-b685-a00658242d6f\" (UID: \"11a2ab48-a0e8-4017-b685-a00658242d6f\") " Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.048513 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11a2ab48-a0e8-4017-b685-a00658242d6f-logs" (OuterVolumeSpecName: "logs") pod "11a2ab48-a0e8-4017-b685-a00658242d6f" (UID: "11a2ab48-a0e8-4017-b685-a00658242d6f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.051340 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11a2ab48-a0e8-4017-b685-a00658242d6f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "11a2ab48-a0e8-4017-b685-a00658242d6f" (UID: "11a2ab48-a0e8-4017-b685-a00658242d6f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.055007 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-scripts" (OuterVolumeSpecName: "scripts") pod "11a2ab48-a0e8-4017-b685-a00658242d6f" (UID: "11a2ab48-a0e8-4017-b685-a00658242d6f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.055094 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "11a2ab48-a0e8-4017-b685-a00658242d6f" (UID: "11a2ab48-a0e8-4017-b685-a00658242d6f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.057492 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "11a2ab48-a0e8-4017-b685-a00658242d6f" (UID: "11a2ab48-a0e8-4017-b685-a00658242d6f"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.059365 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11a2ab48-a0e8-4017-b685-a00658242d6f" (UID: "11a2ab48-a0e8-4017-b685-a00658242d6f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.064212 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11a2ab48-a0e8-4017-b685-a00658242d6f-kube-api-access-fzdlw" (OuterVolumeSpecName: "kube-api-access-fzdlw") pod "11a2ab48-a0e8-4017-b685-a00658242d6f" (UID: "11a2ab48-a0e8-4017-b685-a00658242d6f"). InnerVolumeSpecName "kube-api-access-fzdlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.118868 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-config-data" (OuterVolumeSpecName: "config-data") pod "11a2ab48-a0e8-4017-b685-a00658242d6f" (UID: "11a2ab48-a0e8-4017-b685-a00658242d6f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.151229 4865 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11a2ab48-a0e8-4017-b685-a00658242d6f-logs\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.151267 4865 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/11a2ab48-a0e8-4017-b685-a00658242d6f-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.151280 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzdlw\" (UniqueName: \"kubernetes.io/projected/11a2ab48-a0e8-4017-b685-a00658242d6f-kube-api-access-fzdlw\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.151292 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.151305 4865 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.151331 4865 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.151341 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.151352 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11a2ab48-a0e8-4017-b685-a00658242d6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.178325 4865 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.253497 4865 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.886757 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.971023 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.978255 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.986423 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.988117 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.992909 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 06:11:52 crc kubenswrapper[4865]: I1205 06:11:52.996763 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.071848 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11a2ab48-a0e8-4017-b685-a00658242d6f" path="/var/lib/kubelet/pods/11a2ab48-a0e8-4017-b685-a00658242d6f/volumes" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.072879 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e69363ce-e52c-4aaf-afae-b45fc7a238c6" path="/var/lib/kubelet/pods/e69363ce-e52c-4aaf-afae-b45fc7a238c6/volumes" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.074006 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.074075 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.074119 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.074143 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.074164 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pkzq\" (UniqueName: \"kubernetes.io/projected/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-kube-api-access-2pkzq\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.074204 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-logs\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.074269 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.074292 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.075221 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.170982 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.176587 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.176642 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.176725 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.176763 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.176811 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.176845 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.176862 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pkzq\" (UniqueName: \"kubernetes.io/projected/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-kube-api-access-2pkzq\") pod 
\"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.176901 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-logs\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.177270 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-logs\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.180458 4865 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.180898 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.187664 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.187848 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.188936 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.202794 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.216805 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pkzq\" (UniqueName: \"kubernetes.io/projected/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-kube-api-access-2pkzq\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " 
pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.237330 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.273878 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-kf7z9"] Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.274096 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" podUID="6600be64-7e1e-4d51-a8df-8cb630b9af81" containerName="dnsmasq-dns" containerID="cri-o://72ef5c16fd4daa3dfa79816d71e20a8833a8242effb872e1a5714a6b250c442c" gracePeriod=10 Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.318513 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.899152 4865 generic.go:334] "Generic (PLEG): container finished" podID="6600be64-7e1e-4d51-a8df-8cb630b9af81" containerID="72ef5c16fd4daa3dfa79816d71e20a8833a8242effb872e1a5714a6b250c442c" exitCode=0 Dec 05 06:11:53 crc kubenswrapper[4865]: I1205 06:11:53.899210 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" event={"ID":"6600be64-7e1e-4d51-a8df-8cb630b9af81","Type":"ContainerDied","Data":"72ef5c16fd4daa3dfa79816d71e20a8833a8242effb872e1a5714a6b250c442c"} Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.725631 4865 scope.go:117] "RemoveContainer" containerID="ce08ed414536da4899f690a5bbc9a15e6b7b380c57c7888b9888d0743aaa184d" Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.813290 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.821245 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.923404 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"affd9432-ce69-4ef7-8fee-6c8ac08aa659","Type":"ContainerDied","Data":"0f9fec32a8497b6031595cbb002f3ff9934babf86c77a0cf50e2ed334fa7a8bd"} Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.923490 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.929608 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mczg5" event={"ID":"774b2338-1b59-4df3-ac6f-939724c29231","Type":"ContainerDied","Data":"aa0ad94227755e8d5a290720f6be9a7528f2c06249d6e78a16548919ee454de4"} Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.929638 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa0ad94227755e8d5a290720f6be9a7528f2c06249d6e78a16548919ee454de4" Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.929683 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mczg5" Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.938372 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-scripts\") pod \"774b2338-1b59-4df3-ac6f-939724c29231\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.938482 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/affd9432-ce69-4ef7-8fee-6c8ac08aa659-logs\") pod \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.938551 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-combined-ca-bundle\") pod \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.938585 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s86qq\" (UniqueName: \"kubernetes.io/projected/affd9432-ce69-4ef7-8fee-6c8ac08aa659-kube-api-access-s86qq\") pod \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.938613 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-fernet-keys\") pod \"774b2338-1b59-4df3-ac6f-939724c29231\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.938652 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-public-tls-certs\") pod \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.938683 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-credential-keys\") pod \"774b2338-1b59-4df3-ac6f-939724c29231\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.938713 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/affd9432-ce69-4ef7-8fee-6c8ac08aa659-httpd-run\") pod \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.938777 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv64f\" (UniqueName: \"kubernetes.io/projected/774b2338-1b59-4df3-ac6f-939724c29231-kube-api-access-fv64f\") pod \"774b2338-1b59-4df3-ac6f-939724c29231\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.938802 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-config-data\") pod \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\" (UID: 
\"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.938847 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-combined-ca-bundle\") pod \"774b2338-1b59-4df3-ac6f-939724c29231\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.938868 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-config-data\") pod \"774b2338-1b59-4df3-ac6f-939724c29231\" (UID: \"774b2338-1b59-4df3-ac6f-939724c29231\") " Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.938901 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.938944 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-scripts\") pod \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\" (UID: \"affd9432-ce69-4ef7-8fee-6c8ac08aa659\") " Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.939153 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/affd9432-ce69-4ef7-8fee-6c8ac08aa659-logs" (OuterVolumeSpecName: "logs") pod "affd9432-ce69-4ef7-8fee-6c8ac08aa659" (UID: "affd9432-ce69-4ef7-8fee-6c8ac08aa659"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.939470 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/affd9432-ce69-4ef7-8fee-6c8ac08aa659-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "affd9432-ce69-4ef7-8fee-6c8ac08aa659" (UID: "affd9432-ce69-4ef7-8fee-6c8ac08aa659"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.939873 4865 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/affd9432-ce69-4ef7-8fee-6c8ac08aa659-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.939898 4865 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/affd9432-ce69-4ef7-8fee-6c8ac08aa659-logs\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.955435 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/affd9432-ce69-4ef7-8fee-6c8ac08aa659-kube-api-access-s86qq" (OuterVolumeSpecName: "kube-api-access-s86qq") pod "affd9432-ce69-4ef7-8fee-6c8ac08aa659" (UID: "affd9432-ce69-4ef7-8fee-6c8ac08aa659"). InnerVolumeSpecName "kube-api-access-s86qq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.959925 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "774b2338-1b59-4df3-ac6f-939724c29231" (UID: "774b2338-1b59-4df3-ac6f-939724c29231"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.960769 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-scripts" (OuterVolumeSpecName: "scripts") pod "affd9432-ce69-4ef7-8fee-6c8ac08aa659" (UID: "affd9432-ce69-4ef7-8fee-6c8ac08aa659"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.960778 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-scripts" (OuterVolumeSpecName: "scripts") pod "774b2338-1b59-4df3-ac6f-939724c29231" (UID: "774b2338-1b59-4df3-ac6f-939724c29231"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.960954 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "774b2338-1b59-4df3-ac6f-939724c29231" (UID: "774b2338-1b59-4df3-ac6f-939724c29231"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.963513 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "affd9432-ce69-4ef7-8fee-6c8ac08aa659" (UID: "affd9432-ce69-4ef7-8fee-6c8ac08aa659"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.964892 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/774b2338-1b59-4df3-ac6f-939724c29231-kube-api-access-fv64f" (OuterVolumeSpecName: "kube-api-access-fv64f") pod "774b2338-1b59-4df3-ac6f-939724c29231" (UID: "774b2338-1b59-4df3-ac6f-939724c29231"). InnerVolumeSpecName "kube-api-access-fv64f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:11:55 crc kubenswrapper[4865]: I1205 06:11:55.991680 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "affd9432-ce69-4ef7-8fee-6c8ac08aa659" (UID: "affd9432-ce69-4ef7-8fee-6c8ac08aa659"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.009807 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-config-data" (OuterVolumeSpecName: "config-data") pod "774b2338-1b59-4df3-ac6f-939724c29231" (UID: "774b2338-1b59-4df3-ac6f-939724c29231"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.023022 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "affd9432-ce69-4ef7-8fee-6c8ac08aa659" (UID: "affd9432-ce69-4ef7-8fee-6c8ac08aa659"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.027835 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "774b2338-1b59-4df3-ac6f-939724c29231" (UID: "774b2338-1b59-4df3-ac6f-939724c29231"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.036013 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-config-data" (OuterVolumeSpecName: "config-data") pod "affd9432-ce69-4ef7-8fee-6c8ac08aa659" (UID: "affd9432-ce69-4ef7-8fee-6c8ac08aa659"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.042498 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.042539 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.042553 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s86qq\" (UniqueName: \"kubernetes.io/projected/affd9432-ce69-4ef7-8fee-6c8ac08aa659-kube-api-access-s86qq\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.042564 4865 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.042576 4865 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.042586 4865 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.042598 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv64f\" (UniqueName: \"kubernetes.io/projected/774b2338-1b59-4df3-ac6f-939724c29231-kube-api-access-fv64f\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.042609 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:56 crc kubenswrapper[4865]: 
I1205 06:11:56.042653 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.042668 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/774b2338-1b59-4df3-ac6f-939724c29231-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.042714 4865 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.042729 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/affd9432-ce69-4ef7-8fee-6c8ac08aa659-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.065281 4865 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.148525 4865 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.265011 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.275501 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.291972 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 06:11:56 crc kubenswrapper[4865]: E1205 06:11:56.292312 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="affd9432-ce69-4ef7-8fee-6c8ac08aa659" containerName="glance-httpd" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.292328 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="affd9432-ce69-4ef7-8fee-6c8ac08aa659" containerName="glance-httpd" Dec 05 06:11:56 crc kubenswrapper[4865]: E1205 06:11:56.292341 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="774b2338-1b59-4df3-ac6f-939724c29231" containerName="keystone-bootstrap" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.292348 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="774b2338-1b59-4df3-ac6f-939724c29231" containerName="keystone-bootstrap" Dec 05 06:11:56 crc kubenswrapper[4865]: E1205 06:11:56.292363 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="affd9432-ce69-4ef7-8fee-6c8ac08aa659" containerName="glance-log" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.292368 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="affd9432-ce69-4ef7-8fee-6c8ac08aa659" containerName="glance-log" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.292531 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="774b2338-1b59-4df3-ac6f-939724c29231" containerName="keystone-bootstrap" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.292546 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="affd9432-ce69-4ef7-8fee-6c8ac08aa659" containerName="glance-log" Dec 05 
06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.292564 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="affd9432-ce69-4ef7-8fee-6c8ac08aa659" containerName="glance-httpd" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.293485 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.296072 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.296131 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.320842 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.453922 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc8877b8-5bcf-45b4-b224-755711b47627-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.453975 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc8877b8-5bcf-45b4-b224-755711b47627-logs\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.454023 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-config-data\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.454050 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.454098 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27k2w\" (UniqueName: \"kubernetes.io/projected/dc8877b8-5bcf-45b4-b224-755711b47627-kube-api-access-27k2w\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.454125 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-scripts\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.454185 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.454250 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.556394 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.556504 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.556575 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc8877b8-5bcf-45b4-b224-755711b47627-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.556601 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc8877b8-5bcf-45b4-b224-755711b47627-logs\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.556637 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-config-data\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.556661 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.556704 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27k2w\" (UniqueName: \"kubernetes.io/projected/dc8877b8-5bcf-45b4-b224-755711b47627-kube-api-access-27k2w\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.556730 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.559463 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc8877b8-5bcf-45b4-b224-755711b47627-logs\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.559892 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc8877b8-5bcf-45b4-b224-755711b47627-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.560132 4865 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.563660 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.565100 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-scripts\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.566964 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-config-data\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.567158 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.578231 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27k2w\" (UniqueName: \"kubernetes.io/projected/dc8877b8-5bcf-45b4-b224-755711b47627-kube-api-access-27k2w\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.596767 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " pod="openstack/glance-default-external-api-0" Dec 05 
06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.612890 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.945448 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-mczg5"] Dec 05 06:11:56 crc kubenswrapper[4865]: I1205 06:11:56.953786 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-mczg5"] Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.030911 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="774b2338-1b59-4df3-ac6f-939724c29231" path="/var/lib/kubelet/pods/774b2338-1b59-4df3-ac6f-939724c29231/volumes" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.031679 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="affd9432-ce69-4ef7-8fee-6c8ac08aa659" path="/var/lib/kubelet/pods/affd9432-ce69-4ef7-8fee-6c8ac08aa659/volumes" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.032294 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-lcs2x"] Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.034778 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.035515 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lcs2x"] Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.037837 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.038222 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.038335 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.038452 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.038594 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nq2gr" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.174535 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-combined-ca-bundle\") pod \"keystone-bootstrap-lcs2x\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.174579 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-config-data\") pod \"keystone-bootstrap-lcs2x\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.174606 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-scripts\") pod \"keystone-bootstrap-lcs2x\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.174715 4865 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lvml\" (UniqueName: \"kubernetes.io/projected/96bbdbcb-6f86-41ff-99bc-1af813144fd4-kube-api-access-2lvml\") pod \"keystone-bootstrap-lcs2x\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.174764 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-credential-keys\") pod \"keystone-bootstrap-lcs2x\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.174797 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-fernet-keys\") pod \"keystone-bootstrap-lcs2x\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.276660 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lvml\" (UniqueName: \"kubernetes.io/projected/96bbdbcb-6f86-41ff-99bc-1af813144fd4-kube-api-access-2lvml\") pod \"keystone-bootstrap-lcs2x\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.276746 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-credential-keys\") pod \"keystone-bootstrap-lcs2x\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.276803 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-fernet-keys\") pod \"keystone-bootstrap-lcs2x\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.276949 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-combined-ca-bundle\") pod \"keystone-bootstrap-lcs2x\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.276975 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-config-data\") pod \"keystone-bootstrap-lcs2x\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.277000 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-scripts\") pod \"keystone-bootstrap-lcs2x\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.284656 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-combined-ca-bundle\") pod \"keystone-bootstrap-lcs2x\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.284928 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-config-data\") pod \"keystone-bootstrap-lcs2x\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.285271 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-fernet-keys\") pod \"keystone-bootstrap-lcs2x\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.290880 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-credential-keys\") pod \"keystone-bootstrap-lcs2x\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.291300 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-scripts\") pod \"keystone-bootstrap-lcs2x\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.292754 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lvml\" (UniqueName: \"kubernetes.io/projected/96bbdbcb-6f86-41ff-99bc-1af813144fd4-kube-api-access-2lvml\") pod \"keystone-bootstrap-lcs2x\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:11:57 crc kubenswrapper[4865]: I1205 06:11:57.397945 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:12:02 crc kubenswrapper[4865]: I1205 06:12:02.342504 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" podUID="6600be64-7e1e-4d51-a8df-8cb630b9af81" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout" Dec 05 06:12:04 crc kubenswrapper[4865]: E1205 06:12:04.644724 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 05 06:12:04 crc kubenswrapper[4865]: E1205 06:12:04.645252 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5d6h55ch599h58bh54ch5fh648h665hbbh586hdch654hf8h97h578hfbh554hd8hcbh554h675h6ch5bdh55fh68dh699h54dh64fh78h78h679h595q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ljwbq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-64949fcd59-km7qf_openstack(f25f8ba1-4e54-4919-9fdf-b7d7abf18572): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:12:04 crc kubenswrapper[4865]: E1205 06:12:04.648712 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-64949fcd59-km7qf" podUID="f25f8ba1-4e54-4919-9fdf-b7d7abf18572" Dec 05 06:12:04 crc kubenswrapper[4865]: E1205 06:12:04.651172 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying 
config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 05 06:12:04 crc kubenswrapper[4865]: E1205 06:12:04.651321 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n64hbbh67h98h5b8h594h65fh5f9h644h55ch5f9h594h584hc7h67ch5dbh8bh9fh644hd4hfch74h545h677h5bdh88h64h697h667h646h56bhdbq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-266hb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7c7f7c44b9-rm82k_openstack(b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:12:04 crc kubenswrapper[4865]: E1205 06:12:04.654095 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7c7f7c44b9-rm82k" podUID="b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4" Dec 05 06:12:06 crc kubenswrapper[4865]: E1205 06:12:06.405072 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 05 06:12:06 crc kubenswrapper[4865]: E1205 06:12:06.405435 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5f5h5f6h5b4h77h64chbch66ch644hbdh547h67dhf7h5c4h555h57fh58dh655h544hdfhf5hfh5f5h558h54bh544h76hf8h67h567h586h54dh667q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t7hs9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-585c4d6d99-pjvhk_openstack(698c990e-e8b5-4c1b-8177-4e44874f4a44): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:12:06 crc kubenswrapper[4865]: E1205 06:12:06.411018 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-585c4d6d99-pjvhk" podUID="698c990e-e8b5-4c1b-8177-4e44874f4a44" Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.513416 4865 util.go:48] "No ready sandbox for pod can be found. 
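
All three horizon replicas above fail with ErrImagePull (the pull of quay.io/podified-antelope-centos9/openstack-horizon:current-podified was cancelled) and then sit in ImagePullBackOff, meaning the kubelet retries the pull on a growing delay instead of continuously. A rough sketch of that back-off shape follows; the 10s initial delay and 5-minute cap are assumptions based on commonly documented kubelet defaults, not values taken from this log.

    package main

    import (
        "fmt"
        "time"
    )

    // backoffDelays returns the retry delays produced by doubling an initial
    // delay up to a cap, which is the shape of delay ImagePullBackOff implies.
    func backoffDelays(initial, maxDelay time.Duration, attempts int) []time.Duration {
        delays := make([]time.Duration, 0, attempts)
        d := initial
        for i := 0; i < attempts; i++ {
            delays = append(delays, d)
            d *= 2
            if d > maxDelay {
                d = maxDelay
            }
        }
        return delays
    }

    func main() {
        // Assumed defaults: 10s initial delay, 5m cap (not read from this log).
        for i, d := range backoffDelays(10*time.Second, 5*time.Minute, 7) {
            fmt.Printf("retry %d after %v\n", i+1, d)
        }
    }
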
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.577738 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-ovsdbserver-sb\") pod \"6600be64-7e1e-4d51-a8df-8cb630b9af81\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.577883 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-ovsdbserver-nb\") pod \"6600be64-7e1e-4d51-a8df-8cb630b9af81\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.577952 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-dns-svc\") pod \"6600be64-7e1e-4d51-a8df-8cb630b9af81\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.577986 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-config\") pod \"6600be64-7e1e-4d51-a8df-8cb630b9af81\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.578042 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwmsg\" (UniqueName: \"kubernetes.io/projected/6600be64-7e1e-4d51-a8df-8cb630b9af81-kube-api-access-gwmsg\") pod \"6600be64-7e1e-4d51-a8df-8cb630b9af81\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.578099 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-dns-swift-storage-0\") pod \"6600be64-7e1e-4d51-a8df-8cb630b9af81\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.603448 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6600be64-7e1e-4d51-a8df-8cb630b9af81-kube-api-access-gwmsg" (OuterVolumeSpecName: "kube-api-access-gwmsg") pod "6600be64-7e1e-4d51-a8df-8cb630b9af81" (UID: "6600be64-7e1e-4d51-a8df-8cb630b9af81"). InnerVolumeSpecName "kube-api-access-gwmsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.664239 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-config" (OuterVolumeSpecName: "config") pod "6600be64-7e1e-4d51-a8df-8cb630b9af81" (UID: "6600be64-7e1e-4d51-a8df-8cb630b9af81"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.673456 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6600be64-7e1e-4d51-a8df-8cb630b9af81" (UID: "6600be64-7e1e-4d51-a8df-8cb630b9af81"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.679139 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6600be64-7e1e-4d51-a8df-8cb630b9af81" (UID: "6600be64-7e1e-4d51-a8df-8cb630b9af81"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.679299 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-ovsdbserver-sb\") pod \"6600be64-7e1e-4d51-a8df-8cb630b9af81\" (UID: \"6600be64-7e1e-4d51-a8df-8cb630b9af81\") " Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.679885 4865 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.679903 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.679914 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwmsg\" (UniqueName: \"kubernetes.io/projected/6600be64-7e1e-4d51-a8df-8cb630b9af81-kube-api-access-gwmsg\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:06 crc kubenswrapper[4865]: W1205 06:12:06.679972 4865 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/6600be64-7e1e-4d51-a8df-8cb630b9af81/volumes/kubernetes.io~configmap/ovsdbserver-sb Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.679980 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6600be64-7e1e-4d51-a8df-8cb630b9af81" (UID: "6600be64-7e1e-4d51-a8df-8cb630b9af81"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.689504 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6600be64-7e1e-4d51-a8df-8cb630b9af81" (UID: "6600be64-7e1e-4d51-a8df-8cb630b9af81"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.705962 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6600be64-7e1e-4d51-a8df-8cb630b9af81" (UID: "6600be64-7e1e-4d51-a8df-8cb630b9af81"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.782011 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.782048 4865 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:06 crc kubenswrapper[4865]: I1205 06:12:06.782061 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6600be64-7e1e-4d51-a8df-8cb630b9af81-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:07 crc kubenswrapper[4865]: I1205 06:12:07.038459 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" Dec 05 06:12:07 crc kubenswrapper[4865]: I1205 06:12:07.038444 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" event={"ID":"6600be64-7e1e-4d51-a8df-8cb630b9af81","Type":"ContainerDied","Data":"5d2d561efbc73a3857b22e21170d03c40d30f3908ae7e92df277b833f574c07a"} Dec 05 06:12:07 crc kubenswrapper[4865]: I1205 06:12:07.099845 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-kf7z9"] Dec 05 06:12:07 crc kubenswrapper[4865]: I1205 06:12:07.109666 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-kf7z9"] Dec 05 06:12:07 crc kubenswrapper[4865]: E1205 06:12:07.245943 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 05 06:12:07 crc kubenswrapper[4865]: E1205 06:12:07.246326 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z5dd2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-mn58x_openstack(1de4159c-2d90-4b3a-bcff-84f293a59c35): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:12:07 crc kubenswrapper[4865]: E1205 06:12:07.247683 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-mn58x" podUID="1de4159c-2d90-4b3a-bcff-84f293a59c35" Dec 05 06:12:07 crc kubenswrapper[4865]: I1205 06:12:07.343983 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-895cf5cf-kf7z9" podUID="6600be64-7e1e-4d51-a8df-8cb630b9af81" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout" Dec 05 06:12:08 crc kubenswrapper[4865]: E1205 06:12:08.049463 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-mn58x" podUID="1de4159c-2d90-4b3a-bcff-84f293a59c35" Dec 05 06:12:09 crc kubenswrapper[4865]: I1205 06:12:09.021693 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6600be64-7e1e-4d51-a8df-8cb630b9af81" path="/var/lib/kubelet/pods/6600be64-7e1e-4d51-a8df-8cb630b9af81/volumes" Dec 05 06:12:09 crc kubenswrapper[4865]: I1205 06:12:09.059324 4865 generic.go:334] "Generic (PLEG): container finished" podID="4021310b-c06b-44b3-9c95-7ca10552da10" containerID="56ab844b39f7f229e643ac1e2e555f59ffc130412970cd5a6d57b22a5128c88c" exitCode=0 Dec 05 06:12:09 crc kubenswrapper[4865]: I1205 06:12:09.060482 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g8vbt" event={"ID":"4021310b-c06b-44b3-9c95-7ca10552da10","Type":"ContainerDied","Data":"56ab844b39f7f229e643ac1e2e555f59ffc130412970cd5a6d57b22a5128c88c"} Dec 05 06:12:11 
crc kubenswrapper[4865]: I1205 06:12:11.049245 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:12:11 crc kubenswrapper[4865]: I1205 06:12:11.049675 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:12:11 crc kubenswrapper[4865]: I1205 06:12:11.049724 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 06:12:11 crc kubenswrapper[4865]: I1205 06:12:11.050621 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"20536395e22903e5ca8dee5d63c34f131b2da1d0f9f86ec93b930a0c9e072342"} pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 06:12:11 crc kubenswrapper[4865]: I1205 06:12:11.050680 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" containerID="cri-o://20536395e22903e5ca8dee5d63c34f131b2da1d0f9f86ec93b930a0c9e072342" gracePeriod=600 Dec 05 06:12:12 crc kubenswrapper[4865]: I1205 06:12:12.105429 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="20536395e22903e5ca8dee5d63c34f131b2da1d0f9f86ec93b930a0c9e072342" exitCode=0 Dec 05 06:12:12 crc kubenswrapper[4865]: I1205 06:12:12.105851 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"20536395e22903e5ca8dee5d63c34f131b2da1d0f9f86ec93b930a0c9e072342"} Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.152602 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64949fcd59-km7qf" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.159196 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c7f7c44b9-rm82k" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.204912 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c7f7c44b9-rm82k" event={"ID":"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4","Type":"ContainerDied","Data":"d02361ef8570859292eea00a127d679bfbbdf47435fa1807800b206ec1055326"} Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.204962 4865 util.go:48] "No ready sandbox for pod can be found. 
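
At 06:12:11 the machine-config-daemon fails its liveness probe (GET http://127.0.0.1:8798/health is refused), the kubelet marks it unhealthy and kills the container with a grace period, and PLEG then reports ContainerDied. The sketch below reproduces that kind of HTTP check; the one-second timeout and three-failure threshold are assumptions for illustration, since the real values come from the pod's probe spec rather than this log.

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // probeOnce performs a single HTTP liveness-style check: any transport
    // error (such as "connect: connection refused" in the log above) or a
    // status outside 2xx/3xx counts as a failure.
    func probeOnce(url string, timeout time.Duration) bool {
        client := &http.Client{Timeout: timeout}
        resp, err := client.Get(url)
        if err != nil {
            return false
        }
        defer resp.Body.Close()
        return resp.StatusCode >= 200 && resp.StatusCode < 400
    }

    func main() {
        const failureThreshold = 3 // assumed; taken from the pod spec in reality
        failures := 0
        for failures < failureThreshold {
            if probeOnce("http://127.0.0.1:8798/health", time.Second) {
                fmt.Println("healthy")
                return
            }
            failures++
        }
        fmt.Println("liveness failed; the container would be killed and restarted")
    }
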
Need to start a new one" pod="openstack/horizon-7c7f7c44b9-rm82k" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.207099 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64949fcd59-km7qf" event={"ID":"f25f8ba1-4e54-4919-9fdf-b7d7abf18572","Type":"ContainerDied","Data":"286a1ad523bb32f7076815da56602668e9d663b5de7f8c90a5778a341f44bcf0"} Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.207183 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64949fcd59-km7qf" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.263119 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-logs\") pod \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.263219 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-logs\") pod \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.263241 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-horizon-secret-key\") pod \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.263301 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-266hb\" (UniqueName: \"kubernetes.io/projected/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-kube-api-access-266hb\") pod \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.263345 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljwbq\" (UniqueName: \"kubernetes.io/projected/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-kube-api-access-ljwbq\") pod \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.263365 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-scripts\") pod \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.263427 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-config-data\") pod \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.263445 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-scripts\") pod \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\" (UID: \"f25f8ba1-4e54-4919-9fdf-b7d7abf18572\") " Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.263503 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-horizon-secret-key\") pod \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.263536 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-config-data\") pod \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\" (UID: \"b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4\") " Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.263538 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-logs" (OuterVolumeSpecName: "logs") pod "b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4" (UID: "b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.263912 4865 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-logs\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.264329 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-scripts" (OuterVolumeSpecName: "scripts") pod "f25f8ba1-4e54-4919-9fdf-b7d7abf18572" (UID: "f25f8ba1-4e54-4919-9fdf-b7d7abf18572"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.264372 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-scripts" (OuterVolumeSpecName: "scripts") pod "b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4" (UID: "b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.264589 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-logs" (OuterVolumeSpecName: "logs") pod "f25f8ba1-4e54-4919-9fdf-b7d7abf18572" (UID: "f25f8ba1-4e54-4919-9fdf-b7d7abf18572"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.264950 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-config-data" (OuterVolumeSpecName: "config-data") pod "f25f8ba1-4e54-4919-9fdf-b7d7abf18572" (UID: "f25f8ba1-4e54-4919-9fdf-b7d7abf18572"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.264988 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-config-data" (OuterVolumeSpecName: "config-data") pod "b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4" (UID: "b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.281795 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f25f8ba1-4e54-4919-9fdf-b7d7abf18572" (UID: "f25f8ba1-4e54-4919-9fdf-b7d7abf18572"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.281879 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-kube-api-access-ljwbq" (OuterVolumeSpecName: "kube-api-access-ljwbq") pod "f25f8ba1-4e54-4919-9fdf-b7d7abf18572" (UID: "f25f8ba1-4e54-4919-9fdf-b7d7abf18572"). InnerVolumeSpecName "kube-api-access-ljwbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.282605 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-kube-api-access-266hb" (OuterVolumeSpecName: "kube-api-access-266hb") pod "b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4" (UID: "b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4"). InnerVolumeSpecName "kube-api-access-266hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.290159 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4" (UID: "b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.365540 4865 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-logs\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.365572 4865 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.365586 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-266hb\" (UniqueName: \"kubernetes.io/projected/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-kube-api-access-266hb\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.365594 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljwbq\" (UniqueName: \"kubernetes.io/projected/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-kube-api-access-ljwbq\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.365603 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.365612 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.365621 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f25f8ba1-4e54-4919-9fdf-b7d7abf18572-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.365628 4865 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.365636 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.583177 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64949fcd59-km7qf"] Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.587382 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-64949fcd59-km7qf"] Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.635615 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c7f7c44b9-rm82k"] Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.648311 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c7f7c44b9-rm82k"] Dec 05 06:12:20 crc kubenswrapper[4865]: E1205 06:12:20.708374 4865 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb26cf35b_7e4e_4bb7_93a7_b16cc18ef9e4.slice/crio-d02361ef8570859292eea00a127d679bfbbdf47435fa1807800b206ec1055326\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb26cf35b_7e4e_4bb7_93a7_b16cc18ef9e4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf25f8ba1_4e54_4919_9fdf_b7d7abf18572.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf25f8ba1_4e54_4919_9fdf_b7d7abf18572.slice/crio-286a1ad523bb32f7076815da56602668e9d663b5de7f8c90a5778a341f44bcf0\": RecentStats: unable to find data in memory cache]" Dec 05 06:12:20 crc kubenswrapper[4865]: E1205 06:12:20.755372 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 05 06:12:20 crc kubenswrapper[4865]: E1205 06:12:20.755591 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5d4h5dh5ffh5dh4h75h5fbh594h564h66ch649hdch5ffh87hdfh79hdbh56h88h65ch65ch65bh59hd4h569h684hd4h54h674h676h9fh5cq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sn2st,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1c79c96f-7385-43dc-8ceb-ac1bf6de7a66): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 
06:12:20.816708 4865 scope.go:117] "RemoveContainer" containerID="10bdf87b31554ce9a676365279211426e289ffda01bd8b3e84d429e1c59be80d" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.883497 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-585c4d6d99-pjvhk" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.889212 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-g8vbt" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.984193 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4021310b-c06b-44b3-9c95-7ca10552da10-config\") pod \"4021310b-c06b-44b3-9c95-7ca10552da10\" (UID: \"4021310b-c06b-44b3-9c95-7ca10552da10\") " Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.984469 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/698c990e-e8b5-4c1b-8177-4e44874f4a44-horizon-secret-key\") pod \"698c990e-e8b5-4c1b-8177-4e44874f4a44\" (UID: \"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.984504 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/698c990e-e8b5-4c1b-8177-4e44874f4a44-config-data\") pod \"698c990e-e8b5-4c1b-8177-4e44874f4a44\" (UID: \"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.984539 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qftn\" (UniqueName: \"kubernetes.io/projected/4021310b-c06b-44b3-9c95-7ca10552da10-kube-api-access-2qftn\") pod \"4021310b-c06b-44b3-9c95-7ca10552da10\" (UID: \"4021310b-c06b-44b3-9c95-7ca10552da10\") " Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.984600 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4021310b-c06b-44b3-9c95-7ca10552da10-combined-ca-bundle\") pod \"4021310b-c06b-44b3-9c95-7ca10552da10\" (UID: \"4021310b-c06b-44b3-9c95-7ca10552da10\") " Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.984627 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7hs9\" (UniqueName: \"kubernetes.io/projected/698c990e-e8b5-4c1b-8177-4e44874f4a44-kube-api-access-t7hs9\") pod \"698c990e-e8b5-4c1b-8177-4e44874f4a44\" (UID: \"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.984685 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/698c990e-e8b5-4c1b-8177-4e44874f4a44-scripts\") pod \"698c990e-e8b5-4c1b-8177-4e44874f4a44\" (UID: \"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.984781 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/698c990e-e8b5-4c1b-8177-4e44874f4a44-logs\") pod \"698c990e-e8b5-4c1b-8177-4e44874f4a44\" (UID: \"698c990e-e8b5-4c1b-8177-4e44874f4a44\") " Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.985409 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/698c990e-e8b5-4c1b-8177-4e44874f4a44-logs" (OuterVolumeSpecName: "logs") pod 
"698c990e-e8b5-4c1b-8177-4e44874f4a44" (UID: "698c990e-e8b5-4c1b-8177-4e44874f4a44"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.985329 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/698c990e-e8b5-4c1b-8177-4e44874f4a44-config-data" (OuterVolumeSpecName: "config-data") pod "698c990e-e8b5-4c1b-8177-4e44874f4a44" (UID: "698c990e-e8b5-4c1b-8177-4e44874f4a44"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.985928 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/698c990e-e8b5-4c1b-8177-4e44874f4a44-scripts" (OuterVolumeSpecName: "scripts") pod "698c990e-e8b5-4c1b-8177-4e44874f4a44" (UID: "698c990e-e8b5-4c1b-8177-4e44874f4a44"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.990787 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/698c990e-e8b5-4c1b-8177-4e44874f4a44-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "698c990e-e8b5-4c1b-8177-4e44874f4a44" (UID: "698c990e-e8b5-4c1b-8177-4e44874f4a44"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.990794 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4021310b-c06b-44b3-9c95-7ca10552da10-kube-api-access-2qftn" (OuterVolumeSpecName: "kube-api-access-2qftn") pod "4021310b-c06b-44b3-9c95-7ca10552da10" (UID: "4021310b-c06b-44b3-9c95-7ca10552da10"). InnerVolumeSpecName "kube-api-access-2qftn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:12:20 crc kubenswrapper[4865]: I1205 06:12:20.996028 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/698c990e-e8b5-4c1b-8177-4e44874f4a44-kube-api-access-t7hs9" (OuterVolumeSpecName: "kube-api-access-t7hs9") pod "698c990e-e8b5-4c1b-8177-4e44874f4a44" (UID: "698c990e-e8b5-4c1b-8177-4e44874f4a44"). InnerVolumeSpecName "kube-api-access-t7hs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.012718 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4021310b-c06b-44b3-9c95-7ca10552da10-config" (OuterVolumeSpecName: "config") pod "4021310b-c06b-44b3-9c95-7ca10552da10" (UID: "4021310b-c06b-44b3-9c95-7ca10552da10"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.017918 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4021310b-c06b-44b3-9c95-7ca10552da10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4021310b-c06b-44b3-9c95-7ca10552da10" (UID: "4021310b-c06b-44b3-9c95-7ca10552da10"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.021897 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4" path="/var/lib/kubelet/pods/b26cf35b-7e4e-4bb7-93a7-b16cc18ef9e4/volumes" Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.022375 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f25f8ba1-4e54-4919-9fdf-b7d7abf18572" path="/var/lib/kubelet/pods/f25f8ba1-4e54-4919-9fdf-b7d7abf18572/volumes" Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.086744 4865 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/698c990e-e8b5-4c1b-8177-4e44874f4a44-logs\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.086777 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4021310b-c06b-44b3-9c95-7ca10552da10-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.086787 4865 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/698c990e-e8b5-4c1b-8177-4e44874f4a44-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.086799 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/698c990e-e8b5-4c1b-8177-4e44874f4a44-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.086810 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qftn\" (UniqueName: \"kubernetes.io/projected/4021310b-c06b-44b3-9c95-7ca10552da10-kube-api-access-2qftn\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.086833 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4021310b-c06b-44b3-9c95-7ca10552da10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.086841 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7hs9\" (UniqueName: \"kubernetes.io/projected/698c990e-e8b5-4c1b-8177-4e44874f4a44-kube-api-access-t7hs9\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.086849 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/698c990e-e8b5-4c1b-8177-4e44874f4a44-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.220102 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bd68dd9b8-z62zt"] Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.221707 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g8vbt" event={"ID":"4021310b-c06b-44b3-9c95-7ca10552da10","Type":"ContainerDied","Data":"46df351c6eda26cd6c8cb415679bc2a9cd577421125c27052d20d4024f3ede35"} Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.221737 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46df351c6eda26cd6c8cb415679bc2a9cd577421125c27052d20d4024f3ede35" Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.221716 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-g8vbt" Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.224315 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-585c4d6d99-pjvhk" event={"ID":"698c990e-e8b5-4c1b-8177-4e44874f4a44","Type":"ContainerDied","Data":"d69f591385a79cea2dff5928f85ae63668916d9c66f754f753f5abec1e359335"} Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.224578 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-585c4d6d99-pjvhk" Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.274080 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-585c4d6d99-pjvhk"] Dec 05 06:12:21 crc kubenswrapper[4865]: I1205 06:12:21.281306 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-585c4d6d99-pjvhk"] Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.181431 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-mfzbq"] Dec 05 06:12:22 crc kubenswrapper[4865]: E1205 06:12:22.182050 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6600be64-7e1e-4d51-a8df-8cb630b9af81" containerName="init" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.182075 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="6600be64-7e1e-4d51-a8df-8cb630b9af81" containerName="init" Dec 05 06:12:22 crc kubenswrapper[4865]: E1205 06:12:22.182095 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6600be64-7e1e-4d51-a8df-8cb630b9af81" containerName="dnsmasq-dns" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.182102 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="6600be64-7e1e-4d51-a8df-8cb630b9af81" containerName="dnsmasq-dns" Dec 05 06:12:22 crc kubenswrapper[4865]: E1205 06:12:22.182125 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4021310b-c06b-44b3-9c95-7ca10552da10" containerName="neutron-db-sync" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.182130 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="4021310b-c06b-44b3-9c95-7ca10552da10" containerName="neutron-db-sync" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.182302 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="6600be64-7e1e-4d51-a8df-8cb630b9af81" containerName="dnsmasq-dns" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.182325 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="4021310b-c06b-44b3-9c95-7ca10552da10" containerName="neutron-db-sync" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.183223 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.212488 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkqbh\" (UniqueName: \"kubernetes.io/projected/62b75a9f-3535-47bf-8874-6ef496fc894d-kube-api-access-qkqbh\") pod \"dnsmasq-dns-5ccc5c4795-mfzbq\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.212581 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-mfzbq\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.212637 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-mfzbq\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.212722 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-config\") pod \"dnsmasq-dns-5ccc5c4795-mfzbq\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.212755 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-mfzbq\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.212818 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-mfzbq\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.222740 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-mfzbq"] Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.315862 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-config\") pod \"dnsmasq-dns-5ccc5c4795-mfzbq\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.315917 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-mfzbq\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.315971 4865 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-mfzbq\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.315995 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkqbh\" (UniqueName: \"kubernetes.io/projected/62b75a9f-3535-47bf-8874-6ef496fc894d-kube-api-access-qkqbh\") pod \"dnsmasq-dns-5ccc5c4795-mfzbq\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.316037 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-mfzbq\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.316086 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-mfzbq\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.317772 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-mfzbq\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.318314 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-config\") pod \"dnsmasq-dns-5ccc5c4795-mfzbq\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.320071 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-mfzbq\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.320560 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-mfzbq\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.324160 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-mfzbq\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.387899 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkqbh\" (UniqueName: 
\"kubernetes.io/projected/62b75a9f-3535-47bf-8874-6ef496fc894d-kube-api-access-qkqbh\") pod \"dnsmasq-dns-5ccc5c4795-mfzbq\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.508451 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.522390 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5f79fcff88-45kgr"] Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.535908 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.541380 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.541662 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.541924 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.542421 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-d7f7p" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.555764 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5f79fcff88-45kgr"] Dec 05 06:12:22 crc kubenswrapper[4865]: E1205 06:12:22.607660 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 05 06:12:22 crc kubenswrapper[4865]: E1205 06:12:22.607837 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kh5bv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-f7d5b_openstack(b5e4dce7-c9e7-4813-a957-1df502644792): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:12:22 crc kubenswrapper[4865]: E1205 06:12:22.609696 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-f7d5b" podUID="b5e4dce7-c9e7-4813-a957-1df502644792" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.640312 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kmg9\" (UniqueName: \"kubernetes.io/projected/01721db4-0a32-46e7-a617-4f7369599b6e-kube-api-access-5kmg9\") pod \"neutron-5f79fcff88-45kgr\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.640421 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-combined-ca-bundle\") pod \"neutron-5f79fcff88-45kgr\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " 
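Both image-pull failures in this window (openstack-ceilometer-central earlier and openstack-cinder-api here) first surface as ErrImagePull with "copying config: context canceled", and the same cinder-db-sync pod is reported shortly afterwards with ImagePullBackOff ("Back-off pulling image ..."). The kubelet retries failed pulls on an exponential back-off; the sketch below prints a retry schedule assuming the commonly cited defaults of a 10-second base delay doubling up to a 5-minute cap, which is an assumption about this node rather than something visible in the log.

package main

// Print an exponential back-off schedule for repeated image-pull failures.
// baseDelay and maxDelay mirror commonly cited kubelet defaults (10s base,
// 300s cap); treat them as assumptions, they are not read from this node.

import (
	"fmt"
	"time"
)

func main() {
	baseDelay := 10 * time.Second
	maxDelay := 300 * time.Second

	delay := baseDelay
	elapsed := time.Duration(0)
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("attempt %d: wait %v (total wait so far %v)\n", attempt, delay, elapsed)
		elapsed += delay
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}

Once a pull finally succeeds the back-off stops growing, so the pod can start without waiting out the rest of the schedule.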
pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.640612 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-config\") pod \"neutron-5f79fcff88-45kgr\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.640642 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-httpd-config\") pod \"neutron-5f79fcff88-45kgr\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.640680 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-ovndb-tls-certs\") pod \"neutron-5f79fcff88-45kgr\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:12:22 crc kubenswrapper[4865]: W1205 06:12:22.702946 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca38ca20_0d35_4058_b0f6_bbe4251c6aab.slice/crio-6d21b4ca635b89c25b1312497c7c48661f26fb2870d0fee384c6f7f828540f6a WatchSource:0}: Error finding container 6d21b4ca635b89c25b1312497c7c48661f26fb2870d0fee384c6f7f828540f6a: Status 404 returned error can't find the container with id 6d21b4ca635b89c25b1312497c7c48661f26fb2870d0fee384c6f7f828540f6a Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.742624 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-config\") pod \"neutron-5f79fcff88-45kgr\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.742855 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-httpd-config\") pod \"neutron-5f79fcff88-45kgr\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.742895 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-ovndb-tls-certs\") pod \"neutron-5f79fcff88-45kgr\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.742945 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kmg9\" (UniqueName: \"kubernetes.io/projected/01721db4-0a32-46e7-a617-4f7369599b6e-kube-api-access-5kmg9\") pod \"neutron-5f79fcff88-45kgr\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.742978 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-combined-ca-bundle\") pod 
\"neutron-5f79fcff88-45kgr\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.753013 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-config\") pod \"neutron-5f79fcff88-45kgr\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.754769 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-httpd-config\") pod \"neutron-5f79fcff88-45kgr\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.755708 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-combined-ca-bundle\") pod \"neutron-5f79fcff88-45kgr\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.756263 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-ovndb-tls-certs\") pod \"neutron-5f79fcff88-45kgr\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.775319 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kmg9\" (UniqueName: \"kubernetes.io/projected/01721db4-0a32-46e7-a617-4f7369599b6e-kube-api-access-5kmg9\") pod \"neutron-5f79fcff88-45kgr\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.865316 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.905043 4865 scope.go:117] "RemoveContainer" containerID="0e5c6174279db9f2c2158509cd1abfdeb6bbf2e625e85d118135e06cfdadfcb0" Dec 05 06:12:22 crc kubenswrapper[4865]: I1205 06:12:22.998685 4865 scope.go:117] "RemoveContainer" containerID="72ef5c16fd4daa3dfa79816d71e20a8833a8242effb872e1a5714a6b250c442c" Dec 05 06:12:23 crc kubenswrapper[4865]: I1205 06:12:23.061647 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="698c990e-e8b5-4c1b-8177-4e44874f4a44" path="/var/lib/kubelet/pods/698c990e-e8b5-4c1b-8177-4e44874f4a44/volumes" Dec 05 06:12:23 crc kubenswrapper[4865]: I1205 06:12:23.207261 4865 scope.go:117] "RemoveContainer" containerID="2e3329f5bbc3cad0322a9067aade8f131d1e2190f2e20c074b9694cde384d849" Dec 05 06:12:23 crc kubenswrapper[4865]: I1205 06:12:23.257189 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-78c59b79fd-5jlv4"] Dec 05 06:12:23 crc kubenswrapper[4865]: I1205 06:12:23.301669 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:12:23 crc kubenswrapper[4865]: W1205 06:12:23.350941 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a7a9002_45f0_4787_a5f0_d1dafdb275d2.slice/crio-b7db5c358398a6e320934e5cff598b051c3d63483c68acd4cedcff359b3709a2 WatchSource:0}: Error finding container b7db5c358398a6e320934e5cff598b051c3d63483c68acd4cedcff359b3709a2: Status 404 returned error can't find the container with id b7db5c358398a6e320934e5cff598b051c3d63483c68acd4cedcff359b3709a2 Dec 05 06:12:23 crc kubenswrapper[4865]: I1205 06:12:23.378277 4865 scope.go:117] "RemoveContainer" containerID="d163070852eac0c87032f69fbdb534afbbd8e4f78e69ec919b3b74b72f841eab" Dec 05 06:12:23 crc kubenswrapper[4865]: I1205 06:12:23.404435 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bd68dd9b8-z62zt" event={"ID":"ca38ca20-0d35-4058-b0f6-bbe4251c6aab","Type":"ContainerStarted","Data":"6d21b4ca635b89c25b1312497c7c48661f26fb2870d0fee384c6f7f828540f6a"} Dec 05 06:12:23 crc kubenswrapper[4865]: E1205 06:12:23.482114 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-f7d5b" podUID="b5e4dce7-c9e7-4813-a957-1df502644792" Dec 05 06:12:23 crc kubenswrapper[4865]: I1205 06:12:23.548041 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lcs2x"] Dec 05 06:12:23 crc kubenswrapper[4865]: I1205 06:12:23.628882 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 06:12:23 crc kubenswrapper[4865]: I1205 06:12:23.634762 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-mfzbq"] Dec 05 06:12:23 crc kubenswrapper[4865]: W1205 06:12:23.699808 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62b75a9f_3535_47bf_8874_6ef496fc894d.slice/crio-3433efb53501752eddcaec701019820a9e1edaf61b56535c60292f7ccf7d22b3 WatchSource:0}: Error finding container 3433efb53501752eddcaec701019820a9e1edaf61b56535c60292f7ccf7d22b3: Status 404 returned error can't find the container with id 
3433efb53501752eddcaec701019820a9e1edaf61b56535c60292f7ccf7d22b3 Dec 05 06:12:23 crc kubenswrapper[4865]: I1205 06:12:23.778045 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 06:12:24 crc kubenswrapper[4865]: I1205 06:12:24.083944 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5f79fcff88-45kgr"] Dec 05 06:12:24 crc kubenswrapper[4865]: I1205 06:12:24.424211 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f79fcff88-45kgr" event={"ID":"01721db4-0a32-46e7-a617-4f7369599b6e","Type":"ContainerStarted","Data":"84ec100765c2edcd94d28c8ca76ea2c68256d8c94ff82afcfd8cf548e4c54e8a"} Dec 05 06:12:24 crc kubenswrapper[4865]: I1205 06:12:24.440617 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lcs2x" event={"ID":"96bbdbcb-6f86-41ff-99bc-1af813144fd4","Type":"ContainerStarted","Data":"1041512cc1fbafb37ee520703e17b180cfd4480e7fa2f54343a1bb3e79106b4d"} Dec 05 06:12:24 crc kubenswrapper[4865]: I1205 06:12:24.440663 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lcs2x" event={"ID":"96bbdbcb-6f86-41ff-99bc-1af813144fd4","Type":"ContainerStarted","Data":"ea15bbe8c32736372af42eea700eef8178ccc75272c73655eef39a07899d5bb5"} Dec 05 06:12:24 crc kubenswrapper[4865]: I1205 06:12:24.448626 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dc8877b8-5bcf-45b4-b224-755711b47627","Type":"ContainerStarted","Data":"e08149c4c3bf4457e46d419decfc6f51437ed7155c1173db523079a3fd41582a"} Dec 05 06:12:24 crc kubenswrapper[4865]: I1205 06:12:24.449817 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mn58x" event={"ID":"1de4159c-2d90-4b3a-bcff-84f293a59c35","Type":"ContainerStarted","Data":"df462c8b519563ff848c3dfe981818298a9b61625006590653518fd116417fd6"} Dec 05 06:12:24 crc kubenswrapper[4865]: I1205 06:12:24.451190 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78c59b79fd-5jlv4" event={"ID":"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9","Type":"ContainerStarted","Data":"eb45ab7ead630dd49388ef7aca567b6d4bae0367ddc59fe21d48b85d95fff5a1"} Dec 05 06:12:24 crc kubenswrapper[4865]: I1205 06:12:24.451218 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78c59b79fd-5jlv4" event={"ID":"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9","Type":"ContainerStarted","Data":"03be9c1d4785de5a2cf1207426b7a0456c69ca1a4a29ca58dfa6ec74869d5aa9"} Dec 05 06:12:24 crc kubenswrapper[4865]: I1205 06:12:24.455651 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xs9sp" event={"ID":"4f798138-a4f1-490f-8904-cfccbf0db793","Type":"ContainerStarted","Data":"cad80ed9ffc04d8f5694f91d5bd9697516287fb44fa1b3cf466f846b4b5584b7"} Dec 05 06:12:24 crc kubenswrapper[4865]: I1205 06:12:24.458871 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bd68dd9b8-z62zt" event={"ID":"ca38ca20-0d35-4058-b0f6-bbe4251c6aab","Type":"ContainerStarted","Data":"ffdf18b423c11d0cd7a29a406d546d87ab5e2562b00f5fe331d0abd8d62eb69f"} Dec 05 06:12:24 crc kubenswrapper[4865]: I1205 06:12:24.463304 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a7a9002-45f0-4787-a5f0-d1dafdb275d2","Type":"ContainerStarted","Data":"b7db5c358398a6e320934e5cff598b051c3d63483c68acd4cedcff359b3709a2"} Dec 05 06:12:24 crc kubenswrapper[4865]: I1205 
06:12:24.469188 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-lcs2x" podStartSLOduration=28.469169929 podStartE2EDuration="28.469169929s" podCreationTimestamp="2025-12-05 06:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:12:24.467337277 +0000 UTC m=+1163.747348499" watchObservedRunningTime="2025-12-05 06:12:24.469169929 +0000 UTC m=+1163.749181151" Dec 05 06:12:24 crc kubenswrapper[4865]: I1205 06:12:24.470081 4865 generic.go:334] "Generic (PLEG): container finished" podID="62b75a9f-3535-47bf-8874-6ef496fc894d" containerID="0e3e6180f439fcc8142a5ed3a65119081f4d9961b8cb401e68e2512cd729e75e" exitCode=0 Dec 05 06:12:24 crc kubenswrapper[4865]: I1205 06:12:24.470194 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" event={"ID":"62b75a9f-3535-47bf-8874-6ef496fc894d","Type":"ContainerDied","Data":"0e3e6180f439fcc8142a5ed3a65119081f4d9961b8cb401e68e2512cd729e75e"} Dec 05 06:12:24 crc kubenswrapper[4865]: I1205 06:12:24.470232 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" event={"ID":"62b75a9f-3535-47bf-8874-6ef496fc894d","Type":"ContainerStarted","Data":"3433efb53501752eddcaec701019820a9e1edaf61b56535c60292f7ccf7d22b3"} Dec 05 06:12:24 crc kubenswrapper[4865]: I1205 06:12:24.486001 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"39937e10b37729de9655b631fb05427006e716f9ab3edcd0d9c7edbbc9b5832a"} Dec 05 06:12:24 crc kubenswrapper[4865]: I1205 06:12:24.491062 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-mn58x" podStartSLOduration=3.5185867159999997 podStartE2EDuration="42.491041867s" podCreationTimestamp="2025-12-05 06:11:42 +0000 UTC" firstStartedPulling="2025-12-05 06:11:44.026652918 +0000 UTC m=+1123.306664140" lastFinishedPulling="2025-12-05 06:12:22.999108069 +0000 UTC m=+1162.279119291" observedRunningTime="2025-12-05 06:12:24.485393828 +0000 UTC m=+1163.765405050" watchObservedRunningTime="2025-12-05 06:12:24.491041867 +0000 UTC m=+1163.771053089" Dec 05 06:12:24 crc kubenswrapper[4865]: I1205 06:12:24.530141 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-xs9sp" podStartSLOduration=5.661773312 podStartE2EDuration="42.530113643s" podCreationTimestamp="2025-12-05 06:11:42 +0000 UTC" firstStartedPulling="2025-12-05 06:11:43.912326313 +0000 UTC m=+1123.192337535" lastFinishedPulling="2025-12-05 06:12:20.780666644 +0000 UTC m=+1160.060677866" observedRunningTime="2025-12-05 06:12:24.502976415 +0000 UTC m=+1163.782987637" watchObservedRunningTime="2025-12-05 06:12:24.530113643 +0000 UTC m=+1163.810124855" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.538350 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a7a9002-45f0-4787-a5f0-d1dafdb275d2","Type":"ContainerStarted","Data":"55cac1ca6d6fd7d44cca6d53a391953f66a7a4cbb953929e5d427c25c991fae1"} Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.551092 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
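The pod_startup_latency_tracker entries above also show how the two reported durations relate: podStartE2EDuration is the time from podCreationTimestamp to the observed running time, and podStartSLOduration is that same interval minus the time spent pulling images (lastFinishedPulling minus firstStartedPulling); for keystone-bootstrap-lcs2x the two are equal because no pulling was recorded. The sketch below reproduces the barbican-db-sync-mn58x figures from the timestamps quoted in the log; using watchObservedRunningTime as the running-time reference is an assumption made here because it matches the logged numbers exactly.

package main

// Reproduce the podStartSLOduration / podStartE2EDuration figures logged for
// openstack/barbican-db-sync-mn58x. Timestamps are copied from the journal
// entry above; the "running" reference is taken from watchObservedRunningTime,
// an assumption that happens to reproduce the logged values.

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	created := parse("2025-12-05 06:11:42 +0000 UTC")
	firstPull := parse("2025-12-05 06:11:44.026652918 +0000 UTC")
	lastPull := parse("2025-12-05 06:12:22.999108069 +0000 UTC")
	running := parse("2025-12-05 06:12:24.491041867 +0000 UTC")

	e2e := running.Sub(created)        // expect ~42.491041867s
	pulling := lastPull.Sub(firstPull) // expect ~38.972455151s
	slo := e2e - pulling               // expect ~3.518586716s

	fmt.Printf("podStartE2EDuration ~ %v\n", e2e)
	fmt.Printf("image pulling       ~ %v\n", pulling)
	fmt.Printf("podStartSLOduration ~ %v\n", slo)
}

Running it prints 42.491041867s, 38.972455151s and 3.518586716s, matching the logged podStartE2EDuration and podStartSLOduration (the log's 3.5185867159999997 is just the floating-point rendering of the same value).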
event={"ID":"dc8877b8-5bcf-45b4-b224-755711b47627","Type":"ContainerStarted","Data":"645f42338fde723343788844eabcb62f269953b6fb35812c65fe83b79a7c282a"} Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.712169 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-67df96fc59-crcwg"] Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.714105 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.720881 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.720925 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.769542 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67df96fc59-crcwg"] Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.834685 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b374397b-c64c-439b-b7eb-01d2fb34f474-internal-tls-certs\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.834739 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47n4t\" (UniqueName: \"kubernetes.io/projected/b374397b-c64c-439b-b7eb-01d2fb34f474-kube-api-access-47n4t\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.834803 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b374397b-c64c-439b-b7eb-01d2fb34f474-public-tls-certs\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.834893 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b374397b-c64c-439b-b7eb-01d2fb34f474-httpd-config\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.834918 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b374397b-c64c-439b-b7eb-01d2fb34f474-combined-ca-bundle\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.834932 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b374397b-c64c-439b-b7eb-01d2fb34f474-ovndb-tls-certs\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.834973 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b374397b-c64c-439b-b7eb-01d2fb34f474-config\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.936548 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b374397b-c64c-439b-b7eb-01d2fb34f474-internal-tls-certs\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.936604 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47n4t\" (UniqueName: \"kubernetes.io/projected/b374397b-c64c-439b-b7eb-01d2fb34f474-kube-api-access-47n4t\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.936664 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b374397b-c64c-439b-b7eb-01d2fb34f474-public-tls-certs\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.936696 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b374397b-c64c-439b-b7eb-01d2fb34f474-httpd-config\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.936716 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b374397b-c64c-439b-b7eb-01d2fb34f474-combined-ca-bundle\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.936729 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b374397b-c64c-439b-b7eb-01d2fb34f474-ovndb-tls-certs\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.936763 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b374397b-c64c-439b-b7eb-01d2fb34f474-config\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.948241 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b374397b-c64c-439b-b7eb-01d2fb34f474-config\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.952714 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b374397b-c64c-439b-b7eb-01d2fb34f474-combined-ca-bundle\") pod \"neutron-67df96fc59-crcwg\" (UID: 
\"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.953257 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b374397b-c64c-439b-b7eb-01d2fb34f474-internal-tls-certs\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.954225 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b374397b-c64c-439b-b7eb-01d2fb34f474-public-tls-certs\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.958785 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b374397b-c64c-439b-b7eb-01d2fb34f474-ovndb-tls-certs\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:25 crc kubenswrapper[4865]: I1205 06:12:25.969743 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b374397b-c64c-439b-b7eb-01d2fb34f474-httpd-config\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:26 crc kubenswrapper[4865]: I1205 06:12:26.006557 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47n4t\" (UniqueName: \"kubernetes.io/projected/b374397b-c64c-439b-b7eb-01d2fb34f474-kube-api-access-47n4t\") pod \"neutron-67df96fc59-crcwg\" (UID: \"b374397b-c64c-439b-b7eb-01d2fb34f474\") " pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:26 crc kubenswrapper[4865]: I1205 06:12:26.107294 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:26 crc kubenswrapper[4865]: I1205 06:12:26.585679 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" event={"ID":"62b75a9f-3535-47bf-8874-6ef496fc894d","Type":"ContainerStarted","Data":"052f17d7dd4d5eed035c98d02e3d8a4d7ff7ec6304471171514c3b6d3e20fbab"} Dec 05 06:12:26 crc kubenswrapper[4865]: I1205 06:12:26.586047 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:26 crc kubenswrapper[4865]: I1205 06:12:26.611031 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f79fcff88-45kgr" event={"ID":"01721db4-0a32-46e7-a617-4f7369599b6e","Type":"ContainerStarted","Data":"e3628b4b79584bb90b59b4cb9baadea29c22e95a83c5a7989fb8be1a18ede308"} Dec 05 06:12:26 crc kubenswrapper[4865]: I1205 06:12:26.611076 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f79fcff88-45kgr" event={"ID":"01721db4-0a32-46e7-a617-4f7369599b6e","Type":"ContainerStarted","Data":"c89de8fc3ea0e65295dce7889cb2efc05c7c869baa641ccded41055166dae720"} Dec 05 06:12:26 crc kubenswrapper[4865]: I1205 06:12:26.612084 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:12:26 crc kubenswrapper[4865]: I1205 06:12:26.624628 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" podStartSLOduration=4.62460369 podStartE2EDuration="4.62460369s" podCreationTimestamp="2025-12-05 06:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:12:26.610841641 +0000 UTC m=+1165.890852863" watchObservedRunningTime="2025-12-05 06:12:26.62460369 +0000 UTC m=+1165.904614912" Dec 05 06:12:26 crc kubenswrapper[4865]: I1205 06:12:26.633484 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78c59b79fd-5jlv4" event={"ID":"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9","Type":"ContainerStarted","Data":"2c566b0fa9fad49639e8ef5098b129e44f8c6799cb1513e54a1766170e2190fd"} Dec 05 06:12:26 crc kubenswrapper[4865]: I1205 06:12:26.641642 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66","Type":"ContainerStarted","Data":"260075ad4d4f651121f83eff1321b949610ecae3c506deb3e0c299c03e016b7b"} Dec 05 06:12:26 crc kubenswrapper[4865]: I1205 06:12:26.648352 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5f79fcff88-45kgr" podStartSLOduration=4.648326801 podStartE2EDuration="4.648326801s" podCreationTimestamp="2025-12-05 06:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:12:26.638067771 +0000 UTC m=+1165.918078993" watchObservedRunningTime="2025-12-05 06:12:26.648326801 +0000 UTC m=+1165.928338023" Dec 05 06:12:26 crc kubenswrapper[4865]: I1205 06:12:26.665259 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bd68dd9b8-z62zt" event={"ID":"ca38ca20-0d35-4058-b0f6-bbe4251c6aab","Type":"ContainerStarted","Data":"ae9988b24b0cc529f27a61e58c049c77ec8edcedb21946f5111a11587be650d1"} Dec 05 06:12:26 crc kubenswrapper[4865]: I1205 06:12:26.668992 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/horizon-78c59b79fd-5jlv4" podStartSLOduration=36.100571504 podStartE2EDuration="36.668975595s" podCreationTimestamp="2025-12-05 06:11:50 +0000 UTC" firstStartedPulling="2025-12-05 06:12:23.378706868 +0000 UTC m=+1162.658718090" lastFinishedPulling="2025-12-05 06:12:23.947110969 +0000 UTC m=+1163.227122181" observedRunningTime="2025-12-05 06:12:26.665787825 +0000 UTC m=+1165.945799047" watchObservedRunningTime="2025-12-05 06:12:26.668975595 +0000 UTC m=+1165.948986817" Dec 05 06:12:26 crc kubenswrapper[4865]: I1205 06:12:26.675266 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dc8877b8-5bcf-45b4-b224-755711b47627","Type":"ContainerStarted","Data":"23a7bde9ff9e34c33448cbbc01f9a8a20ddaed84638f360243a30c3f4643b9b0"} Dec 05 06:12:26 crc kubenswrapper[4865]: I1205 06:12:26.703619 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a7a9002-45f0-4787-a5f0-d1dafdb275d2","Type":"ContainerStarted","Data":"899e2f12ffb610d91c186ad3ed88dffe94d5245e9912c5f90b2d8c33d6271291"} Dec 05 06:12:26 crc kubenswrapper[4865]: I1205 06:12:26.707571 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-bd68dd9b8-z62zt" podStartSLOduration=35.937495511 podStartE2EDuration="36.707547907s" podCreationTimestamp="2025-12-05 06:11:50 +0000 UTC" firstStartedPulling="2025-12-05 06:12:22.743146167 +0000 UTC m=+1162.023157389" lastFinishedPulling="2025-12-05 06:12:23.513198563 +0000 UTC m=+1162.793209785" observedRunningTime="2025-12-05 06:12:26.69245274 +0000 UTC m=+1165.972463962" watchObservedRunningTime="2025-12-05 06:12:26.707547907 +0000 UTC m=+1165.987559129" Dec 05 06:12:26 crc kubenswrapper[4865]: I1205 06:12:26.726202 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=30.726180644 podStartE2EDuration="30.726180644s" podCreationTimestamp="2025-12-05 06:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:12:26.724602959 +0000 UTC m=+1166.004614171" watchObservedRunningTime="2025-12-05 06:12:26.726180644 +0000 UTC m=+1166.006191866" Dec 05 06:12:26 crc kubenswrapper[4865]: I1205 06:12:26.765547 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=34.765526527 podStartE2EDuration="34.765526527s" podCreationTimestamp="2025-12-05 06:11:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:12:26.755416551 +0000 UTC m=+1166.035427763" watchObservedRunningTime="2025-12-05 06:12:26.765526527 +0000 UTC m=+1166.045537749" Dec 05 06:12:27 crc kubenswrapper[4865]: I1205 06:12:27.042007 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67df96fc59-crcwg"] Dec 05 06:12:27 crc kubenswrapper[4865]: I1205 06:12:27.810776 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67df96fc59-crcwg" event={"ID":"b374397b-c64c-439b-b7eb-01d2fb34f474","Type":"ContainerStarted","Data":"8a9705a44ff3797ea1d05bbb97476e37b4c5c5e974b22d1d20b26836cbf062b9"} Dec 05 06:12:27 crc kubenswrapper[4865]: I1205 06:12:27.811107 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67df96fc59-crcwg" 
event={"ID":"b374397b-c64c-439b-b7eb-01d2fb34f474","Type":"ContainerStarted","Data":"0143478ab6818077a1d28158515fdb3301d25b9f51e2640358894a4d58bb0248"} Dec 05 06:12:28 crc kubenswrapper[4865]: I1205 06:12:28.839594 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67df96fc59-crcwg" event={"ID":"b374397b-c64c-439b-b7eb-01d2fb34f474","Type":"ContainerStarted","Data":"fd98051c7f8b90adfdaa5b38197f9dd08aef543980a7e6a6a675600a87798b6a"} Dec 05 06:12:28 crc kubenswrapper[4865]: I1205 06:12:28.900208 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-67df96fc59-crcwg" podStartSLOduration=3.90018902 podStartE2EDuration="3.90018902s" podCreationTimestamp="2025-12-05 06:12:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:12:28.879203777 +0000 UTC m=+1168.159214999" watchObservedRunningTime="2025-12-05 06:12:28.90018902 +0000 UTC m=+1168.180200242" Dec 05 06:12:29 crc kubenswrapper[4865]: I1205 06:12:29.852304 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:30 crc kubenswrapper[4865]: I1205 06:12:30.875252 4865 generic.go:334] "Generic (PLEG): container finished" podID="4f798138-a4f1-490f-8904-cfccbf0db793" containerID="cad80ed9ffc04d8f5694f91d5bd9697516287fb44fa1b3cf466f846b4b5584b7" exitCode=0 Dec 05 06:12:30 crc kubenswrapper[4865]: I1205 06:12:30.875643 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xs9sp" event={"ID":"4f798138-a4f1-490f-8904-cfccbf0db793","Type":"ContainerDied","Data":"cad80ed9ffc04d8f5694f91d5bd9697516287fb44fa1b3cf466f846b4b5584b7"} Dec 05 06:12:31 crc kubenswrapper[4865]: I1205 06:12:31.155602 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:12:31 crc kubenswrapper[4865]: I1205 06:12:31.156220 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:12:31 crc kubenswrapper[4865]: I1205 06:12:31.298487 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:12:31 crc kubenswrapper[4865]: I1205 06:12:31.298538 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:12:31 crc kubenswrapper[4865]: I1205 06:12:31.884400 4865 generic.go:334] "Generic (PLEG): container finished" podID="1de4159c-2d90-4b3a-bcff-84f293a59c35" containerID="df462c8b519563ff848c3dfe981818298a9b61625006590653518fd116417fd6" exitCode=0 Dec 05 06:12:31 crc kubenswrapper[4865]: I1205 06:12:31.884660 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mn58x" event={"ID":"1de4159c-2d90-4b3a-bcff-84f293a59c35","Type":"ContainerDied","Data":"df462c8b519563ff848c3dfe981818298a9b61625006590653518fd116417fd6"} Dec 05 06:12:31 crc kubenswrapper[4865]: I1205 06:12:31.886638 4865 generic.go:334] "Generic (PLEG): container finished" podID="96bbdbcb-6f86-41ff-99bc-1af813144fd4" containerID="1041512cc1fbafb37ee520703e17b180cfd4480e7fa2f54343a1bb3e79106b4d" exitCode=0 Dec 05 06:12:31 crc kubenswrapper[4865]: I1205 06:12:31.886744 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lcs2x" 
event={"ID":"96bbdbcb-6f86-41ff-99bc-1af813144fd4","Type":"ContainerDied","Data":"1041512cc1fbafb37ee520703e17b180cfd4480e7fa2f54343a1bb3e79106b4d"} Dec 05 06:12:32 crc kubenswrapper[4865]: I1205 06:12:32.513021 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:32 crc kubenswrapper[4865]: I1205 06:12:32.588527 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-bzcl4"] Dec 05 06:12:32 crc kubenswrapper[4865]: I1205 06:12:32.588782 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" podUID="b6bede1f-8e18-4ccd-8f12-8bf7df2e7828" containerName="dnsmasq-dns" containerID="cri-o://f850146e71caab5731ce1a161a22dbecb87ccd602f550f39d2da944128969cff" gracePeriod=10 Dec 05 06:12:33 crc kubenswrapper[4865]: I1205 06:12:33.177321 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" podUID="b6bede1f-8e18-4ccd-8f12-8bf7df2e7828" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.139:5353: connect: connection refused" Dec 05 06:12:33 crc kubenswrapper[4865]: I1205 06:12:33.320077 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 06:12:33 crc kubenswrapper[4865]: I1205 06:12:33.320125 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 06:12:33 crc kubenswrapper[4865]: I1205 06:12:33.407521 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 06:12:33 crc kubenswrapper[4865]: I1205 06:12:33.421103 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 06:12:33 crc kubenswrapper[4865]: I1205 06:12:33.908578 4865 generic.go:334] "Generic (PLEG): container finished" podID="b6bede1f-8e18-4ccd-8f12-8bf7df2e7828" containerID="f850146e71caab5731ce1a161a22dbecb87ccd602f550f39d2da944128969cff" exitCode=0 Dec 05 06:12:33 crc kubenswrapper[4865]: I1205 06:12:33.909449 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" event={"ID":"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828","Type":"ContainerDied","Data":"f850146e71caab5731ce1a161a22dbecb87ccd602f550f39d2da944128969cff"} Dec 05 06:12:33 crc kubenswrapper[4865]: I1205 06:12:33.909881 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 06:12:33 crc kubenswrapper[4865]: I1205 06:12:33.909905 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 06:12:35 crc kubenswrapper[4865]: I1205 06:12:35.945615 4865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 06:12:35 crc kubenswrapper[4865]: I1205 06:12:35.945651 4865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.413059 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mn58x" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.416139 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-xs9sp" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.417412 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.592289 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1de4159c-2d90-4b3a-bcff-84f293a59c35-combined-ca-bundle\") pod \"1de4159c-2d90-4b3a-bcff-84f293a59c35\" (UID: \"1de4159c-2d90-4b3a-bcff-84f293a59c35\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.592996 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-fernet-keys\") pod \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.593057 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-credential-keys\") pod \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.593093 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f798138-a4f1-490f-8904-cfccbf0db793-config-data\") pod \"4f798138-a4f1-490f-8904-cfccbf0db793\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.593123 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-config-data\") pod \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.593143 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lvml\" (UniqueName: \"kubernetes.io/projected/96bbdbcb-6f86-41ff-99bc-1af813144fd4-kube-api-access-2lvml\") pod \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.593177 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1de4159c-2d90-4b3a-bcff-84f293a59c35-db-sync-config-data\") pod \"1de4159c-2d90-4b3a-bcff-84f293a59c35\" (UID: \"1de4159c-2d90-4b3a-bcff-84f293a59c35\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.593195 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-combined-ca-bundle\") pod \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.593244 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f798138-a4f1-490f-8904-cfccbf0db793-logs\") pod \"4f798138-a4f1-490f-8904-cfccbf0db793\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.593260 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f798138-a4f1-490f-8904-cfccbf0db793-combined-ca-bundle\") pod \"4f798138-a4f1-490f-8904-cfccbf0db793\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.593302 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f798138-a4f1-490f-8904-cfccbf0db793-scripts\") pod \"4f798138-a4f1-490f-8904-cfccbf0db793\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.593319 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-scripts\") pod \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\" (UID: \"96bbdbcb-6f86-41ff-99bc-1af813144fd4\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.593335 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5dd2\" (UniqueName: \"kubernetes.io/projected/1de4159c-2d90-4b3a-bcff-84f293a59c35-kube-api-access-z5dd2\") pod \"1de4159c-2d90-4b3a-bcff-84f293a59c35\" (UID: \"1de4159c-2d90-4b3a-bcff-84f293a59c35\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.593365 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwnkp\" (UniqueName: \"kubernetes.io/projected/4f798138-a4f1-490f-8904-cfccbf0db793-kube-api-access-hwnkp\") pod \"4f798138-a4f1-490f-8904-cfccbf0db793\" (UID: \"4f798138-a4f1-490f-8904-cfccbf0db793\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.599090 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f798138-a4f1-490f-8904-cfccbf0db793-logs" (OuterVolumeSpecName: "logs") pod "4f798138-a4f1-490f-8904-cfccbf0db793" (UID: "4f798138-a4f1-490f-8904-cfccbf0db793"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.601281 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f798138-a4f1-490f-8904-cfccbf0db793-kube-api-access-hwnkp" (OuterVolumeSpecName: "kube-api-access-hwnkp") pod "4f798138-a4f1-490f-8904-cfccbf0db793" (UID: "4f798138-a4f1-490f-8904-cfccbf0db793"). InnerVolumeSpecName "kube-api-access-hwnkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.612660 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "96bbdbcb-6f86-41ff-99bc-1af813144fd4" (UID: "96bbdbcb-6f86-41ff-99bc-1af813144fd4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.612705 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "96bbdbcb-6f86-41ff-99bc-1af813144fd4" (UID: "96bbdbcb-6f86-41ff-99bc-1af813144fd4"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.613094 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-scripts" (OuterVolumeSpecName: "scripts") pod "96bbdbcb-6f86-41ff-99bc-1af813144fd4" (UID: "96bbdbcb-6f86-41ff-99bc-1af813144fd4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.614410 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.614758 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.614915 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f798138-a4f1-490f-8904-cfccbf0db793-scripts" (OuterVolumeSpecName: "scripts") pod "4f798138-a4f1-490f-8904-cfccbf0db793" (UID: "4f798138-a4f1-490f-8904-cfccbf0db793"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.615719 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1de4159c-2d90-4b3a-bcff-84f293a59c35-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1de4159c-2d90-4b3a-bcff-84f293a59c35" (UID: "1de4159c-2d90-4b3a-bcff-84f293a59c35"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.636463 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1de4159c-2d90-4b3a-bcff-84f293a59c35-kube-api-access-z5dd2" (OuterVolumeSpecName: "kube-api-access-z5dd2") pod "1de4159c-2d90-4b3a-bcff-84f293a59c35" (UID: "1de4159c-2d90-4b3a-bcff-84f293a59c35"). InnerVolumeSpecName "kube-api-access-z5dd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.652961 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96bbdbcb-6f86-41ff-99bc-1af813144fd4-kube-api-access-2lvml" (OuterVolumeSpecName: "kube-api-access-2lvml") pod "96bbdbcb-6f86-41ff-99bc-1af813144fd4" (UID: "96bbdbcb-6f86-41ff-99bc-1af813144fd4"). InnerVolumeSpecName "kube-api-access-2lvml". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.681095 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1de4159c-2d90-4b3a-bcff-84f293a59c35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1de4159c-2d90-4b3a-bcff-84f293a59c35" (UID: "1de4159c-2d90-4b3a-bcff-84f293a59c35"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.697315 4865 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f798138-a4f1-490f-8904-cfccbf0db793-logs\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.697358 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f798138-a4f1-490f-8904-cfccbf0db793-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.697397 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.697414 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5dd2\" (UniqueName: \"kubernetes.io/projected/1de4159c-2d90-4b3a-bcff-84f293a59c35-kube-api-access-z5dd2\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.697429 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwnkp\" (UniqueName: \"kubernetes.io/projected/4f798138-a4f1-490f-8904-cfccbf0db793-kube-api-access-hwnkp\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.697441 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1de4159c-2d90-4b3a-bcff-84f293a59c35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.697452 4865 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.697495 4865 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.697507 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lvml\" (UniqueName: \"kubernetes.io/projected/96bbdbcb-6f86-41ff-99bc-1af813144fd4-kube-api-access-2lvml\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.697518 4865 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1de4159c-2d90-4b3a-bcff-84f293a59c35-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.710815 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f798138-a4f1-490f-8904-cfccbf0db793-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f798138-a4f1-490f-8904-cfccbf0db793" (UID: "4f798138-a4f1-490f-8904-cfccbf0db793"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.711337 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f798138-a4f1-490f-8904-cfccbf0db793-config-data" (OuterVolumeSpecName: "config-data") pod "4f798138-a4f1-490f-8904-cfccbf0db793" (UID: "4f798138-a4f1-490f-8904-cfccbf0db793"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.730219 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96bbdbcb-6f86-41ff-99bc-1af813144fd4" (UID: "96bbdbcb-6f86-41ff-99bc-1af813144fd4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.733950 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-config-data" (OuterVolumeSpecName: "config-data") pod "96bbdbcb-6f86-41ff-99bc-1af813144fd4" (UID: "96bbdbcb-6f86-41ff-99bc-1af813144fd4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.745746 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.749855 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.794587 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.798686 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f798138-a4f1-490f-8904-cfccbf0db793-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.798711 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.798719 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96bbdbcb-6f86-41ff-99bc-1af813144fd4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.798731 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f798138-a4f1-490f-8904-cfccbf0db793-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.899733 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-dns-swift-storage-0\") pod \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.899800 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-ovsdbserver-nb\") pod \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.899963 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-config\") pod \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\" 
(UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.900025 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-dns-svc\") pod \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.900095 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j8fk\" (UniqueName: \"kubernetes.io/projected/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-kube-api-access-4j8fk\") pod \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.900116 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-ovsdbserver-sb\") pod \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\" (UID: \"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828\") " Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.904250 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-kube-api-access-4j8fk" (OuterVolumeSpecName: "kube-api-access-4j8fk") pod "b6bede1f-8e18-4ccd-8f12-8bf7df2e7828" (UID: "b6bede1f-8e18-4ccd-8f12-8bf7df2e7828"). InnerVolumeSpecName "kube-api-access-4j8fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.943387 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b6bede1f-8e18-4ccd-8f12-8bf7df2e7828" (UID: "b6bede1f-8e18-4ccd-8f12-8bf7df2e7828"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.944009 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b6bede1f-8e18-4ccd-8f12-8bf7df2e7828" (UID: "b6bede1f-8e18-4ccd-8f12-8bf7df2e7828"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.949981 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b6bede1f-8e18-4ccd-8f12-8bf7df2e7828" (UID: "b6bede1f-8e18-4ccd-8f12-8bf7df2e7828"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.961232 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-config" (OuterVolumeSpecName: "config") pod "b6bede1f-8e18-4ccd-8f12-8bf7df2e7828" (UID: "b6bede1f-8e18-4ccd-8f12-8bf7df2e7828"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.961927 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lcs2x" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.962075 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lcs2x" event={"ID":"96bbdbcb-6f86-41ff-99bc-1af813144fd4","Type":"ContainerDied","Data":"ea15bbe8c32736372af42eea700eef8178ccc75272c73655eef39a07899d5bb5"} Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.962119 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea15bbe8c32736372af42eea700eef8178ccc75272c73655eef39a07899d5bb5" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.963636 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mn58x" event={"ID":"1de4159c-2d90-4b3a-bcff-84f293a59c35","Type":"ContainerDied","Data":"1c28652e943ec547456885c1223bec0b8320d07fd74ceb9c535f6417e5ff62d1"} Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.963687 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c28652e943ec547456885c1223bec0b8320d07fd74ceb9c535f6417e5ff62d1" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.963754 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mn58x" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.966087 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6bede1f-8e18-4ccd-8f12-8bf7df2e7828" (UID: "b6bede1f-8e18-4ccd-8f12-8bf7df2e7828"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.978640 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" event={"ID":"b6bede1f-8e18-4ccd-8f12-8bf7df2e7828","Type":"ContainerDied","Data":"9a3e6a681a3208685c7ecfbd724a67c22142792e36504c325612b65585149a8b"} Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.978708 4865 scope.go:117] "RemoveContainer" containerID="f850146e71caab5731ce1a161a22dbecb87ccd602f550f39d2da944128969cff" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.978928 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-bzcl4" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.984905 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xs9sp" event={"ID":"4f798138-a4f1-490f-8904-cfccbf0db793","Type":"ContainerDied","Data":"d71e34d52561bb90448f19a0adf2a8fe5ca14d211a0fb73d9e7ebac5cc05cf9b"} Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.984948 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d71e34d52561bb90448f19a0adf2a8fe5ca14d211a0fb73d9e7ebac5cc05cf9b" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.984969 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-xs9sp" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.985583 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 06:12:36 crc kubenswrapper[4865]: I1205 06:12:36.986124 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.002147 4865 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.002186 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j8fk\" (UniqueName: \"kubernetes.io/projected/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-kube-api-access-4j8fk\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.002201 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.002214 4865 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.002225 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.002237 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.038328 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-bzcl4"] Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.045376 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-bzcl4"] Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.065514 4865 scope.go:117] "RemoveContainer" containerID="4b22ac2b2285bfbb5b6e3827affd91b577617a6e333e8f46de9ae23685a94e0a" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.787519 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-95f789cff-nbpm9"] Dec 05 06:12:37 crc kubenswrapper[4865]: E1205 06:12:37.788385 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1de4159c-2d90-4b3a-bcff-84f293a59c35" containerName="barbican-db-sync" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.788455 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="1de4159c-2d90-4b3a-bcff-84f293a59c35" containerName="barbican-db-sync" Dec 05 06:12:37 crc kubenswrapper[4865]: E1205 06:12:37.798061 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f798138-a4f1-490f-8904-cfccbf0db793" containerName="placement-db-sync" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.798277 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f798138-a4f1-490f-8904-cfccbf0db793" containerName="placement-db-sync" Dec 05 06:12:37 crc kubenswrapper[4865]: E1205 
06:12:37.798383 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96bbdbcb-6f86-41ff-99bc-1af813144fd4" containerName="keystone-bootstrap" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.798455 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="96bbdbcb-6f86-41ff-99bc-1af813144fd4" containerName="keystone-bootstrap" Dec 05 06:12:37 crc kubenswrapper[4865]: E1205 06:12:37.798536 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6bede1f-8e18-4ccd-8f12-8bf7df2e7828" containerName="init" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.798586 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6bede1f-8e18-4ccd-8f12-8bf7df2e7828" containerName="init" Dec 05 06:12:37 crc kubenswrapper[4865]: E1205 06:12:37.798662 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6bede1f-8e18-4ccd-8f12-8bf7df2e7828" containerName="dnsmasq-dns" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.798729 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6bede1f-8e18-4ccd-8f12-8bf7df2e7828" containerName="dnsmasq-dns" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.799185 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="1de4159c-2d90-4b3a-bcff-84f293a59c35" containerName="barbican-db-sync" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.800328 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="96bbdbcb-6f86-41ff-99bc-1af813144fd4" containerName="keystone-bootstrap" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.800427 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6bede1f-8e18-4ccd-8f12-8bf7df2e7828" containerName="dnsmasq-dns" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.800501 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f798138-a4f1-490f-8904-cfccbf0db793" containerName="placement-db-sync" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.801749 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-95f789cff-nbpm9" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.832022 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.839262 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tthks" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.839519 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.853607 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-668bb48dd6-6gzl7"] Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.855469 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.886798 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.888184 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.889696 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.890451 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nq2gr" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.900343 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 06:12:37 crc kubenswrapper[4865]: I1205 06:12:37.921020 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.023372 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-95f789cff-nbpm9"] Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.073499 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6564bc679b-dbsbx"] Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.075468 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-internal-tls-certs\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.075551 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527588f6-952d-4f9c-990c-775b34d48d78-config-data\") pod \"barbican-worker-95f789cff-nbpm9\" (UID: \"527588f6-952d-4f9c-990c-775b34d48d78\") " pod="openstack/barbican-worker-95f789cff-nbpm9" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.075603 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-fernet-keys\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.075654 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-scripts\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.075674 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-public-tls-certs\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.075696 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-config-data\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.075716 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527588f6-952d-4f9c-990c-775b34d48d78-combined-ca-bundle\") pod \"barbican-worker-95f789cff-nbpm9\" (UID: \"527588f6-952d-4f9c-990c-775b34d48d78\") " pod="openstack/barbican-worker-95f789cff-nbpm9" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.075738 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/527588f6-952d-4f9c-990c-775b34d48d78-logs\") pod \"barbican-worker-95f789cff-nbpm9\" (UID: \"527588f6-952d-4f9c-990c-775b34d48d78\") " pod="openstack/barbican-worker-95f789cff-nbpm9" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.075765 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2hcw\" (UniqueName: \"kubernetes.io/projected/52184630-757a-4290-a4a0-380b5ffb1c76-kube-api-access-k2hcw\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.075894 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67chb\" (UniqueName: \"kubernetes.io/projected/527588f6-952d-4f9c-990c-775b34d48d78-kube-api-access-67chb\") pod \"barbican-worker-95f789cff-nbpm9\" (UID: \"527588f6-952d-4f9c-990c-775b34d48d78\") " pod="openstack/barbican-worker-95f789cff-nbpm9" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.076104 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-credential-keys\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.076140 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-combined-ca-bundle\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.076206 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/527588f6-952d-4f9c-990c-775b34d48d78-config-data-custom\") pod \"barbican-worker-95f789cff-nbpm9\" (UID: \"527588f6-952d-4f9c-990c-775b34d48d78\") " pod="openstack/barbican-worker-95f789cff-nbpm9" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.076714 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.089155 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.137675 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-699b5d9784-7n29d"] Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.139402 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.146230 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-668bb48dd6-6gzl7"] Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.151053 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.155729 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.155938 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-wwz5j" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.156069 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.156244 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.176071 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6564bc679b-dbsbx"] Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.179276 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-internal-tls-certs\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.179311 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527588f6-952d-4f9c-990c-775b34d48d78-config-data\") pod \"barbican-worker-95f789cff-nbpm9\" (UID: \"527588f6-952d-4f9c-990c-775b34d48d78\") " pod="openstack/barbican-worker-95f789cff-nbpm9" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.179343 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-fernet-keys\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.179371 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-scripts\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.179385 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-public-tls-certs\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.179403 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-config-data\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.179420 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527588f6-952d-4f9c-990c-775b34d48d78-combined-ca-bundle\") pod \"barbican-worker-95f789cff-nbpm9\" (UID: \"527588f6-952d-4f9c-990c-775b34d48d78\") " pod="openstack/barbican-worker-95f789cff-nbpm9" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.179433 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/527588f6-952d-4f9c-990c-775b34d48d78-logs\") pod \"barbican-worker-95f789cff-nbpm9\" (UID: \"527588f6-952d-4f9c-990c-775b34d48d78\") " pod="openstack/barbican-worker-95f789cff-nbpm9" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.179449 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2hcw\" (UniqueName: \"kubernetes.io/projected/52184630-757a-4290-a4a0-380b5ffb1c76-kube-api-access-k2hcw\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.179496 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67chb\" (UniqueName: \"kubernetes.io/projected/527588f6-952d-4f9c-990c-775b34d48d78-kube-api-access-67chb\") pod \"barbican-worker-95f789cff-nbpm9\" (UID: \"527588f6-952d-4f9c-990c-775b34d48d78\") " pod="openstack/barbican-worker-95f789cff-nbpm9" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.179549 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-credential-keys\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.179571 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-combined-ca-bundle\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.179605 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/527588f6-952d-4f9c-990c-775b34d48d78-config-data-custom\") pod \"barbican-worker-95f789cff-nbpm9\" (UID: \"527588f6-952d-4f9c-990c-775b34d48d78\") " pod="openstack/barbican-worker-95f789cff-nbpm9" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.186442 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/527588f6-952d-4f9c-990c-775b34d48d78-logs\") pod \"barbican-worker-95f789cff-nbpm9\" (UID: \"527588f6-952d-4f9c-990c-775b34d48d78\") " pod="openstack/barbican-worker-95f789cff-nbpm9" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.200567 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/527588f6-952d-4f9c-990c-775b34d48d78-config-data-custom\") pod \"barbican-worker-95f789cff-nbpm9\" (UID: \"527588f6-952d-4f9c-990c-775b34d48d78\") " pod="openstack/barbican-worker-95f789cff-nbpm9" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.200638 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-699b5d9784-7n29d"] Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.210368 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-credential-keys\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.212564 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-combined-ca-bundle\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.214699 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527588f6-952d-4f9c-990c-775b34d48d78-config-data\") pod \"barbican-worker-95f789cff-nbpm9\" (UID: \"527588f6-952d-4f9c-990c-775b34d48d78\") " pod="openstack/barbican-worker-95f789cff-nbpm9" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.218227 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-scripts\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.222961 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-fernet-keys\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.227362 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-public-tls-certs\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.236433 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67chb\" (UniqueName: \"kubernetes.io/projected/527588f6-952d-4f9c-990c-775b34d48d78-kube-api-access-67chb\") pod \"barbican-worker-95f789cff-nbpm9\" (UID: \"527588f6-952d-4f9c-990c-775b34d48d78\") " pod="openstack/barbican-worker-95f789cff-nbpm9" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.240575 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-config-data\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.240661 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-xhd48"] Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.245318 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52184630-757a-4290-a4a0-380b5ffb1c76-internal-tls-certs\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.251050 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.261313 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2hcw\" (UniqueName: \"kubernetes.io/projected/52184630-757a-4290-a4a0-380b5ffb1c76-kube-api-access-k2hcw\") pod \"keystone-668bb48dd6-6gzl7\" (UID: \"52184630-757a-4290-a4a0-380b5ffb1c76\") " pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.261856 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527588f6-952d-4f9c-990c-775b34d48d78-combined-ca-bundle\") pod \"barbican-worker-95f789cff-nbpm9\" (UID: \"527588f6-952d-4f9c-990c-775b34d48d78\") " pod="openstack/barbican-worker-95f789cff-nbpm9" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.272941 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-xhd48"] Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.282144 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44e70007-d815-432e-9cb5-bc2cc61a86fa-scripts\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.282216 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44e70007-d815-432e-9cb5-bc2cc61a86fa-internal-tls-certs\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.282275 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44e70007-d815-432e-9cb5-bc2cc61a86fa-public-tls-certs\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.282291 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b71ad914-2c87-4cd5-94ad-ffc717f3600a-config-data\") pod \"barbican-keystone-listener-6564bc679b-dbsbx\" (UID: \"b71ad914-2c87-4cd5-94ad-ffc717f3600a\") " pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" Dec 05 06:12:38 crc kubenswrapper[4865]: 
I1205 06:12:38.282309 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e70007-d815-432e-9cb5-bc2cc61a86fa-combined-ca-bundle\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.282344 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b71ad914-2c87-4cd5-94ad-ffc717f3600a-config-data-custom\") pod \"barbican-keystone-listener-6564bc679b-dbsbx\" (UID: \"b71ad914-2c87-4cd5-94ad-ffc717f3600a\") " pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.282388 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44e70007-d815-432e-9cb5-bc2cc61a86fa-config-data\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.282418 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b71ad914-2c87-4cd5-94ad-ffc717f3600a-combined-ca-bundle\") pod \"barbican-keystone-listener-6564bc679b-dbsbx\" (UID: \"b71ad914-2c87-4cd5-94ad-ffc717f3600a\") " pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.282439 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44e70007-d815-432e-9cb5-bc2cc61a86fa-logs\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.282464 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9tvg\" (UniqueName: \"kubernetes.io/projected/44e70007-d815-432e-9cb5-bc2cc61a86fa-kube-api-access-x9tvg\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.282480 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m8vs\" (UniqueName: \"kubernetes.io/projected/b71ad914-2c87-4cd5-94ad-ffc717f3600a-kube-api-access-9m8vs\") pod \"barbican-keystone-listener-6564bc679b-dbsbx\" (UID: \"b71ad914-2c87-4cd5-94ad-ffc717f3600a\") " pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.282555 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b71ad914-2c87-4cd5-94ad-ffc717f3600a-logs\") pod \"barbican-keystone-listener-6564bc679b-dbsbx\" (UID: \"b71ad914-2c87-4cd5-94ad-ffc717f3600a\") " pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.315204 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-595958cf6d-v6bd6"] Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.316803 4865 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.323805 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.336877 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-595958cf6d-v6bd6"] Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.389907 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44e70007-d815-432e-9cb5-bc2cc61a86fa-public-tls-certs\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.390156 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b71ad914-2c87-4cd5-94ad-ffc717f3600a-config-data\") pod \"barbican-keystone-listener-6564bc679b-dbsbx\" (UID: \"b71ad914-2c87-4cd5-94ad-ffc717f3600a\") " pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.390238 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-dns-svc\") pod \"dnsmasq-dns-688c87cc99-xhd48\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.390315 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e70007-d815-432e-9cb5-bc2cc61a86fa-combined-ca-bundle\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.390405 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b71ad914-2c87-4cd5-94ad-ffc717f3600a-config-data-custom\") pod \"barbican-keystone-listener-6564bc679b-dbsbx\" (UID: \"b71ad914-2c87-4cd5-94ad-ffc717f3600a\") " pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.390479 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-xhd48\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.390551 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-xhd48\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.390635 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-config\") pod 
\"dnsmasq-dns-688c87cc99-xhd48\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.390718 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44e70007-d815-432e-9cb5-bc2cc61a86fa-config-data\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.390801 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b71ad914-2c87-4cd5-94ad-ffc717f3600a-combined-ca-bundle\") pod \"barbican-keystone-listener-6564bc679b-dbsbx\" (UID: \"b71ad914-2c87-4cd5-94ad-ffc717f3600a\") " pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.390893 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44e70007-d815-432e-9cb5-bc2cc61a86fa-logs\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.391549 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9tvg\" (UniqueName: \"kubernetes.io/projected/44e70007-d815-432e-9cb5-bc2cc61a86fa-kube-api-access-x9tvg\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.391587 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m8vs\" (UniqueName: \"kubernetes.io/projected/b71ad914-2c87-4cd5-94ad-ffc717f3600a-kube-api-access-9m8vs\") pod \"barbican-keystone-listener-6564bc679b-dbsbx\" (UID: \"b71ad914-2c87-4cd5-94ad-ffc717f3600a\") " pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.391739 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b71ad914-2c87-4cd5-94ad-ffc717f3600a-logs\") pod \"barbican-keystone-listener-6564bc679b-dbsbx\" (UID: \"b71ad914-2c87-4cd5-94ad-ffc717f3600a\") " pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.391769 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44e70007-d815-432e-9cb5-bc2cc61a86fa-scripts\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.391818 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44e70007-d815-432e-9cb5-bc2cc61a86fa-internal-tls-certs\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.391865 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s67cz\" (UniqueName: 
\"kubernetes.io/projected/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-kube-api-access-s67cz\") pod \"dnsmasq-dns-688c87cc99-xhd48\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.391886 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-xhd48\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.392155 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44e70007-d815-432e-9cb5-bc2cc61a86fa-logs\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.393021 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b71ad914-2c87-4cd5-94ad-ffc717f3600a-logs\") pod \"barbican-keystone-listener-6564bc679b-dbsbx\" (UID: \"b71ad914-2c87-4cd5-94ad-ffc717f3600a\") " pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.395182 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44e70007-d815-432e-9cb5-bc2cc61a86fa-public-tls-certs\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.399851 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b71ad914-2c87-4cd5-94ad-ffc717f3600a-config-data-custom\") pod \"barbican-keystone-listener-6564bc679b-dbsbx\" (UID: \"b71ad914-2c87-4cd5-94ad-ffc717f3600a\") " pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.401456 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44e70007-d815-432e-9cb5-bc2cc61a86fa-config-data\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.406488 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b71ad914-2c87-4cd5-94ad-ffc717f3600a-combined-ca-bundle\") pod \"barbican-keystone-listener-6564bc679b-dbsbx\" (UID: \"b71ad914-2c87-4cd5-94ad-ffc717f3600a\") " pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.407655 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b71ad914-2c87-4cd5-94ad-ffc717f3600a-config-data\") pod \"barbican-keystone-listener-6564bc679b-dbsbx\" (UID: \"b71ad914-2c87-4cd5-94ad-ffc717f3600a\") " pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.412232 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/44e70007-d815-432e-9cb5-bc2cc61a86fa-combined-ca-bundle\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.421345 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44e70007-d815-432e-9cb5-bc2cc61a86fa-scripts\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.424291 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44e70007-d815-432e-9cb5-bc2cc61a86fa-internal-tls-certs\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.430489 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m8vs\" (UniqueName: \"kubernetes.io/projected/b71ad914-2c87-4cd5-94ad-ffc717f3600a-kube-api-access-9m8vs\") pod \"barbican-keystone-listener-6564bc679b-dbsbx\" (UID: \"b71ad914-2c87-4cd5-94ad-ffc717f3600a\") " pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.454247 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-95f789cff-nbpm9" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.462618 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9tvg\" (UniqueName: \"kubernetes.io/projected/44e70007-d815-432e-9cb5-bc2cc61a86fa-kube-api-access-x9tvg\") pod \"placement-699b5d9784-7n29d\" (UID: \"44e70007-d815-432e-9cb5-bc2cc61a86fa\") " pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.468584 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.493681 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1eecac4-4210-4a9d-9d8a-bcf21327c712-logs\") pod \"barbican-api-595958cf6d-v6bd6\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.493786 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s67cz\" (UniqueName: \"kubernetes.io/projected/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-kube-api-access-s67cz\") pod \"dnsmasq-dns-688c87cc99-xhd48\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.493810 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-xhd48\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.493842 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lmhv\" (UniqueName: \"kubernetes.io/projected/b1eecac4-4210-4a9d-9d8a-bcf21327c712-kube-api-access-9lmhv\") pod \"barbican-api-595958cf6d-v6bd6\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.493873 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1eecac4-4210-4a9d-9d8a-bcf21327c712-config-data-custom\") pod \"barbican-api-595958cf6d-v6bd6\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.493913 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-dns-svc\") pod \"dnsmasq-dns-688c87cc99-xhd48\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.493937 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-xhd48\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.493955 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-xhd48\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.493975 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-config\") pod \"dnsmasq-dns-688c87cc99-xhd48\" (UID: 
\"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.494003 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1eecac4-4210-4a9d-9d8a-bcf21327c712-combined-ca-bundle\") pod \"barbican-api-595958cf6d-v6bd6\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.494042 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1eecac4-4210-4a9d-9d8a-bcf21327c712-config-data\") pod \"barbican-api-595958cf6d-v6bd6\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.495174 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-xhd48\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.495202 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-xhd48\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.495292 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-dns-svc\") pod \"dnsmasq-dns-688c87cc99-xhd48\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.495313 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-config\") pod \"dnsmasq-dns-688c87cc99-xhd48\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.495622 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-xhd48\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.512429 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s67cz\" (UniqueName: \"kubernetes.io/projected/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-kube-api-access-s67cz\") pod \"dnsmasq-dns-688c87cc99-xhd48\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.547701 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.596021 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1eecac4-4210-4a9d-9d8a-bcf21327c712-logs\") pod \"barbican-api-595958cf6d-v6bd6\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.596123 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lmhv\" (UniqueName: \"kubernetes.io/projected/b1eecac4-4210-4a9d-9d8a-bcf21327c712-kube-api-access-9lmhv\") pod \"barbican-api-595958cf6d-v6bd6\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.596152 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1eecac4-4210-4a9d-9d8a-bcf21327c712-config-data-custom\") pod \"barbican-api-595958cf6d-v6bd6\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.596233 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1eecac4-4210-4a9d-9d8a-bcf21327c712-combined-ca-bundle\") pod \"barbican-api-595958cf6d-v6bd6\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.596273 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1eecac4-4210-4a9d-9d8a-bcf21327c712-config-data\") pod \"barbican-api-595958cf6d-v6bd6\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.597013 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1eecac4-4210-4a9d-9d8a-bcf21327c712-logs\") pod \"barbican-api-595958cf6d-v6bd6\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.599980 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1eecac4-4210-4a9d-9d8a-bcf21327c712-combined-ca-bundle\") pod \"barbican-api-595958cf6d-v6bd6\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.600429 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1eecac4-4210-4a9d-9d8a-bcf21327c712-config-data\") pod \"barbican-api-595958cf6d-v6bd6\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.602382 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1eecac4-4210-4a9d-9d8a-bcf21327c712-config-data-custom\") pod \"barbican-api-595958cf6d-v6bd6\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:38 crc 
kubenswrapper[4865]: I1205 06:12:38.619313 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lmhv\" (UniqueName: \"kubernetes.io/projected/b1eecac4-4210-4a9d-9d8a-bcf21327c712-kube-api-access-9lmhv\") pod \"barbican-api-595958cf6d-v6bd6\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.651721 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.661921 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:38 crc kubenswrapper[4865]: I1205 06:12:38.710256 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" Dec 05 06:12:39 crc kubenswrapper[4865]: I1205 06:12:39.023475 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6bede1f-8e18-4ccd-8f12-8bf7df2e7828" path="/var/lib/kubelet/pods/b6bede1f-8e18-4ccd-8f12-8bf7df2e7828/volumes" Dec 05 06:12:39 crc kubenswrapper[4865]: I1205 06:12:39.097959 4865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 06:12:39 crc kubenswrapper[4865]: I1205 06:12:39.097988 4865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 06:12:39 crc kubenswrapper[4865]: I1205 06:12:39.774474 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6564bc679b-dbsbx"] Dec 05 06:12:39 crc kubenswrapper[4865]: I1205 06:12:39.831793 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-668bb48dd6-6gzl7"] Dec 05 06:12:39 crc kubenswrapper[4865]: I1205 06:12:39.874895 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-699b5d9784-7n29d"] Dec 05 06:12:39 crc kubenswrapper[4865]: W1205 06:12:39.883032 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52184630_757a_4290_a4a0_380b5ffb1c76.slice/crio-99525f28b40552a1ffe450d2f94ffa2ee9924af4d83735d22ee0914be09f3ac8 WatchSource:0}: Error finding container 99525f28b40552a1ffe450d2f94ffa2ee9924af4d83735d22ee0914be09f3ac8: Status 404 returned error can't find the container with id 99525f28b40552a1ffe450d2f94ffa2ee9924af4d83735d22ee0914be09f3ac8 Dec 05 06:12:39 crc kubenswrapper[4865]: I1205 06:12:39.921892 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-95f789cff-nbpm9"] Dec 05 06:12:39 crc kubenswrapper[4865]: I1205 06:12:39.943677 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-595958cf6d-v6bd6"] Dec 05 06:12:39 crc kubenswrapper[4865]: W1205 06:12:39.967208 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44e70007_d815_432e_9cb5_bc2cc61a86fa.slice/crio-fb82c4cd994e7ab883ed19c20fda98be36ffdd090feea3a257a79a0ce4882f78 WatchSource:0}: Error finding container fb82c4cd994e7ab883ed19c20fda98be36ffdd090feea3a257a79a0ce4882f78: Status 404 returned error can't find the container with id fb82c4cd994e7ab883ed19c20fda98be36ffdd090feea3a257a79a0ce4882f78 Dec 05 06:12:40 crc kubenswrapper[4865]: I1205 06:12:40.085327 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-688c87cc99-xhd48"] Dec 05 06:12:40 crc kubenswrapper[4865]: I1205 06:12:40.165447 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" event={"ID":"b71ad914-2c87-4cd5-94ad-ffc717f3600a","Type":"ContainerStarted","Data":"f15c5694b0d48c0cf9ed2bd7664730c926c2eda644035acb39420e1fd59aa093"} Dec 05 06:12:40 crc kubenswrapper[4865]: I1205 06:12:40.245697 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-668bb48dd6-6gzl7" event={"ID":"52184630-757a-4290-a4a0-380b5ffb1c76","Type":"ContainerStarted","Data":"99525f28b40552a1ffe450d2f94ffa2ee9924af4d83735d22ee0914be09f3ac8"} Dec 05 06:12:40 crc kubenswrapper[4865]: I1205 06:12:40.256254 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-95f789cff-nbpm9" event={"ID":"527588f6-952d-4f9c-990c-775b34d48d78","Type":"ContainerStarted","Data":"b7dd2df8e6fda4618afd520eee213763a7ed4b6b6989db7c112590eec99a0d26"} Dec 05 06:12:40 crc kubenswrapper[4865]: I1205 06:12:40.262127 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-699b5d9784-7n29d" event={"ID":"44e70007-d815-432e-9cb5-bc2cc61a86fa","Type":"ContainerStarted","Data":"fb82c4cd994e7ab883ed19c20fda98be36ffdd090feea3a257a79a0ce4882f78"} Dec 05 06:12:40 crc kubenswrapper[4865]: I1205 06:12:40.271674 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66","Type":"ContainerStarted","Data":"a1494eed2d1b4355ec4d7a369f30b6a3f7ce01b64f50182ed648191895a4042d"} Dec 05 06:12:40 crc kubenswrapper[4865]: I1205 06:12:40.273652 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-595958cf6d-v6bd6" event={"ID":"b1eecac4-4210-4a9d-9d8a-bcf21327c712","Type":"ContainerStarted","Data":"ab6870366cfc39dc3a6c278c682e40ce16fb286b8f2fe4e2f84ab8c557ceef9f"} Dec 05 06:12:41 crc kubenswrapper[4865]: I1205 06:12:41.168950 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-bd68dd9b8-z62zt" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 05 06:12:41 crc kubenswrapper[4865]: I1205 06:12:41.307048 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-78c59b79fd-5jlv4" podUID="0b2dbfc6-6978-4613-a307-d4d4b4b88bc9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 05 06:12:41 crc kubenswrapper[4865]: I1205 06:12:41.338986 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-595958cf6d-v6bd6" event={"ID":"b1eecac4-4210-4a9d-9d8a-bcf21327c712","Type":"ContainerStarted","Data":"228bf3b63facdaeef9d5e1438f980af4843972de4fed62596836d1ea133b9793"} Dec 05 06:12:41 crc kubenswrapper[4865]: I1205 06:12:41.355304 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f7d5b" event={"ID":"b5e4dce7-c9e7-4813-a957-1df502644792","Type":"ContainerStarted","Data":"a9145a9a4c3cb9abfd5ddb8a4ccea3b5c568087506ed84014d0eb1eb2681eb5c"} Dec 05 06:12:41 crc kubenswrapper[4865]: I1205 06:12:41.367899 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-668bb48dd6-6gzl7" 
event={"ID":"52184630-757a-4290-a4a0-380b5ffb1c76","Type":"ContainerStarted","Data":"72662b7df0577f9169b6962956446dcf080a903135808e88de1227523e4dfc6f"} Dec 05 06:12:41 crc kubenswrapper[4865]: I1205 06:12:41.369029 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:12:41 crc kubenswrapper[4865]: I1205 06:12:41.392047 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-699b5d9784-7n29d" event={"ID":"44e70007-d815-432e-9cb5-bc2cc61a86fa","Type":"ContainerStarted","Data":"7395eecdd25a1d91431b5dc7cbe0a326aa8f69313eafeb557c4d33ccee2734d3"} Dec 05 06:12:41 crc kubenswrapper[4865]: I1205 06:12:41.438102 4865 generic.go:334] "Generic (PLEG): container finished" podID="d8b0dc39-032b-4ed9-ae54-aafa8a0333cd" containerID="25cdd4d077f78826ee9b9198eebf8271678450c1765b49820cdca99051a67b7b" exitCode=0 Dec 05 06:12:41 crc kubenswrapper[4865]: I1205 06:12:41.438154 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-xhd48" event={"ID":"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd","Type":"ContainerDied","Data":"25cdd4d077f78826ee9b9198eebf8271678450c1765b49820cdca99051a67b7b"} Dec 05 06:12:41 crc kubenswrapper[4865]: I1205 06:12:41.438180 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-xhd48" event={"ID":"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd","Type":"ContainerStarted","Data":"f88e220c982065db6825ad7b54ecd876bfd1f828a376c3119cc75f6ceb0f2b89"} Dec 05 06:12:41 crc kubenswrapper[4865]: I1205 06:12:41.461386 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-f7d5b" podStartSLOduration=5.05676176 podStartE2EDuration="1m0.461361002s" podCreationTimestamp="2025-12-05 06:11:41 +0000 UTC" firstStartedPulling="2025-12-05 06:11:43.591083135 +0000 UTC m=+1122.871094347" lastFinishedPulling="2025-12-05 06:12:38.995682367 +0000 UTC m=+1178.275693589" observedRunningTime="2025-12-05 06:12:41.399310296 +0000 UTC m=+1180.679321518" watchObservedRunningTime="2025-12-05 06:12:41.461361002 +0000 UTC m=+1180.741372224" Dec 05 06:12:41 crc kubenswrapper[4865]: I1205 06:12:41.476251 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-668bb48dd6-6gzl7" podStartSLOduration=4.476230986 podStartE2EDuration="4.476230986s" podCreationTimestamp="2025-12-05 06:12:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:12:41.426626484 +0000 UTC m=+1180.706637706" watchObservedRunningTime="2025-12-05 06:12:41.476230986 +0000 UTC m=+1180.756242208" Dec 05 06:12:41 crc kubenswrapper[4865]: E1205 06:12:41.650313 4865 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8b0dc39_032b_4ed9_ae54_aafa8a0333cd.slice/crio-conmon-25cdd4d077f78826ee9b9198eebf8271678450c1765b49820cdca99051a67b7b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8b0dc39_032b_4ed9_ae54_aafa8a0333cd.slice/crio-25cdd4d077f78826ee9b9198eebf8271678450c1765b49820cdca99051a67b7b.scope\": RecentStats: unable to find data in memory cache]" Dec 05 06:12:41 crc kubenswrapper[4865]: I1205 06:12:41.993075 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-c67fc55d6-grhds"] Dec 05 06:12:41 crc 
kubenswrapper[4865]: I1205 06:12:41.995343 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.002371 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.002423 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.019003 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k7dm\" (UniqueName: \"kubernetes.io/projected/115995c2-39bf-4d60-bcf9-ca342384137a-kube-api-access-5k7dm\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.019073 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/115995c2-39bf-4d60-bcf9-ca342384137a-logs\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.019191 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/115995c2-39bf-4d60-bcf9-ca342384137a-internal-tls-certs\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.019244 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/115995c2-39bf-4d60-bcf9-ca342384137a-combined-ca-bundle\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.019300 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/115995c2-39bf-4d60-bcf9-ca342384137a-config-data-custom\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.019357 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/115995c2-39bf-4d60-bcf9-ca342384137a-config-data\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.019415 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/115995c2-39bf-4d60-bcf9-ca342384137a-public-tls-certs\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.041441 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-c67fc55d6-grhds"] Dec 05 06:12:42 crc 
kubenswrapper[4865]: I1205 06:12:42.120780 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/115995c2-39bf-4d60-bcf9-ca342384137a-config-data\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.124477 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/115995c2-39bf-4d60-bcf9-ca342384137a-public-tls-certs\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.125037 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k7dm\" (UniqueName: \"kubernetes.io/projected/115995c2-39bf-4d60-bcf9-ca342384137a-kube-api-access-5k7dm\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.125180 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/115995c2-39bf-4d60-bcf9-ca342384137a-logs\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.125423 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/115995c2-39bf-4d60-bcf9-ca342384137a-internal-tls-certs\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.125541 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/115995c2-39bf-4d60-bcf9-ca342384137a-combined-ca-bundle\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.125646 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/115995c2-39bf-4d60-bcf9-ca342384137a-config-data-custom\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.129739 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/115995c2-39bf-4d60-bcf9-ca342384137a-logs\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.140592 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/115995c2-39bf-4d60-bcf9-ca342384137a-internal-tls-certs\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.141689 4865 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/115995c2-39bf-4d60-bcf9-ca342384137a-config-data-custom\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.142045 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/115995c2-39bf-4d60-bcf9-ca342384137a-config-data\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.142487 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/115995c2-39bf-4d60-bcf9-ca342384137a-public-tls-certs\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.143189 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/115995c2-39bf-4d60-bcf9-ca342384137a-combined-ca-bundle\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.165164 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k7dm\" (UniqueName: \"kubernetes.io/projected/115995c2-39bf-4d60-bcf9-ca342384137a-kube-api-access-5k7dm\") pod \"barbican-api-c67fc55d6-grhds\" (UID: \"115995c2-39bf-4d60-bcf9-ca342384137a\") " pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.345781 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.482045 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-699b5d9784-7n29d" event={"ID":"44e70007-d815-432e-9cb5-bc2cc61a86fa","Type":"ContainerStarted","Data":"46afc887656253fd7986433084e59bf83d9c358c421d5e0f4897faf0971653e6"} Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.482898 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.482929 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.493887 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-xhd48" event={"ID":"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd","Type":"ContainerStarted","Data":"fb59ca69976da3003fbab50ecbd813c98bdbe16fdfb4dfbc8ed99d53dbea72b3"} Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.494724 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.513535 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-595958cf6d-v6bd6" event={"ID":"b1eecac4-4210-4a9d-9d8a-bcf21327c712","Type":"ContainerStarted","Data":"88f7b7f11a4aadda8688b3c330bcb99d02949cc4ea6ab76e59b7420018bf5a25"} Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.554664 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-699b5d9784-7n29d" podStartSLOduration=5.554639252 podStartE2EDuration="5.554639252s" podCreationTimestamp="2025-12-05 06:12:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:12:42.525293746 +0000 UTC m=+1181.805304968" watchObservedRunningTime="2025-12-05 06:12:42.554639252 +0000 UTC m=+1181.834650464" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.606160 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688c87cc99-xhd48" podStartSLOduration=4.606137668 podStartE2EDuration="4.606137668s" podCreationTimestamp="2025-12-05 06:12:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:12:42.603812091 +0000 UTC m=+1181.883823313" watchObservedRunningTime="2025-12-05 06:12:42.606137668 +0000 UTC m=+1181.886148890" Dec 05 06:12:42 crc kubenswrapper[4865]: I1205 06:12:42.676174 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-595958cf6d-v6bd6" podStartSLOduration=4.676152161 podStartE2EDuration="4.676152161s" podCreationTimestamp="2025-12-05 06:12:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:12:42.66034112 +0000 UTC m=+1181.940352342" watchObservedRunningTime="2025-12-05 06:12:42.676152161 +0000 UTC m=+1181.956163383" Dec 05 06:12:43 crc kubenswrapper[4865]: I1205 06:12:43.096751 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-c67fc55d6-grhds"] Dec 05 06:12:43 crc kubenswrapper[4865]: I1205 06:12:43.540505 4865 generic.go:334] "Generic (PLEG): container finished" 
podID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerID="88f7b7f11a4aadda8688b3c330bcb99d02949cc4ea6ab76e59b7420018bf5a25" exitCode=1 Dec 05 06:12:43 crc kubenswrapper[4865]: I1205 06:12:43.540576 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-595958cf6d-v6bd6" event={"ID":"b1eecac4-4210-4a9d-9d8a-bcf21327c712","Type":"ContainerDied","Data":"88f7b7f11a4aadda8688b3c330bcb99d02949cc4ea6ab76e59b7420018bf5a25"} Dec 05 06:12:43 crc kubenswrapper[4865]: I1205 06:12:43.541127 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:43 crc kubenswrapper[4865]: I1205 06:12:43.541280 4865 scope.go:117] "RemoveContainer" containerID="88f7b7f11a4aadda8688b3c330bcb99d02949cc4ea6ab76e59b7420018bf5a25" Dec 05 06:12:43 crc kubenswrapper[4865]: I1205 06:12:43.550640 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c67fc55d6-grhds" event={"ID":"115995c2-39bf-4d60-bcf9-ca342384137a","Type":"ContainerStarted","Data":"649f1aa3a76a0e7ab6f094f14f90d1bb21779266a9e8a4cca34c105b60576af2"} Dec 05 06:12:43 crc kubenswrapper[4865]: I1205 06:12:43.663933 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:43 crc kubenswrapper[4865]: I1205 06:12:43.695467 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 06:12:43 crc kubenswrapper[4865]: I1205 06:12:43.695572 4865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 06:12:43 crc kubenswrapper[4865]: I1205 06:12:43.696727 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 06:12:43 crc kubenswrapper[4865]: I1205 06:12:43.798899 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 06:12:43 crc kubenswrapper[4865]: I1205 06:12:43.799037 4865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 06:12:43 crc kubenswrapper[4865]: I1205 06:12:43.804647 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 06:12:44 crc kubenswrapper[4865]: I1205 06:12:44.662495 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:46 crc kubenswrapper[4865]: I1205 06:12:46.663200 4865 generic.go:334] "Generic (PLEG): container finished" podID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerID="65b241205b5ca975f2be3cab1a17f9491c61d898e83899f547e5c1acada1c790" exitCode=1 Dec 05 06:12:46 crc kubenswrapper[4865]: I1205 06:12:46.663665 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-595958cf6d-v6bd6" event={"ID":"b1eecac4-4210-4a9d-9d8a-bcf21327c712","Type":"ContainerDied","Data":"65b241205b5ca975f2be3cab1a17f9491c61d898e83899f547e5c1acada1c790"} Dec 05 06:12:46 crc kubenswrapper[4865]: I1205 06:12:46.663903 4865 scope.go:117] "RemoveContainer" containerID="88f7b7f11a4aadda8688b3c330bcb99d02949cc4ea6ab76e59b7420018bf5a25" Dec 05 06:12:46 crc kubenswrapper[4865]: I1205 06:12:46.663931 4865 scope.go:117] "RemoveContainer" containerID="65b241205b5ca975f2be3cab1a17f9491c61d898e83899f547e5c1acada1c790" Dec 05 06:12:46 crc kubenswrapper[4865]: E1205 06:12:46.664235 4865 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"barbican-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=barbican-api pod=barbican-api-595958cf6d-v6bd6_openstack(b1eecac4-4210-4a9d-9d8a-bcf21327c712)\"" pod="openstack/barbican-api-595958cf6d-v6bd6" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" Dec 05 06:12:46 crc kubenswrapper[4865]: I1205 06:12:46.666188 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-595958cf6d-v6bd6" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": dial tcp 10.217.0.160:9311: connect: connection refused" Dec 05 06:12:46 crc kubenswrapper[4865]: I1205 06:12:46.696885 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" event={"ID":"b71ad914-2c87-4cd5-94ad-ffc717f3600a","Type":"ContainerStarted","Data":"3a5540c75d07c661cae838396f3d433bd1d6414f71c12bf08bda026c174c7c59"} Dec 05 06:12:46 crc kubenswrapper[4865]: I1205 06:12:46.697128 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" event={"ID":"b71ad914-2c87-4cd5-94ad-ffc717f3600a","Type":"ContainerStarted","Data":"e9a2fb0c4f8cf729679868e61a4b652ed92a77608ab5ff03e3c989f90305a4a1"} Dec 05 06:12:46 crc kubenswrapper[4865]: I1205 06:12:46.703442 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c67fc55d6-grhds" event={"ID":"115995c2-39bf-4d60-bcf9-ca342384137a","Type":"ContainerStarted","Data":"fbe55c1ac72e532d9ec28e1f36d45639b5ae404e25e82324e2890a2d7cd33827"} Dec 05 06:12:46 crc kubenswrapper[4865]: I1205 06:12:46.703483 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c67fc55d6-grhds" event={"ID":"115995c2-39bf-4d60-bcf9-ca342384137a","Type":"ContainerStarted","Data":"5a3a22fd985a533ef6ef4c07624fbc62f772bef8ede893da1d0f708c227a111a"} Dec 05 06:12:46 crc kubenswrapper[4865]: I1205 06:12:46.704164 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:46 crc kubenswrapper[4865]: I1205 06:12:46.704200 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:46 crc kubenswrapper[4865]: I1205 06:12:46.713009 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-95f789cff-nbpm9" event={"ID":"527588f6-952d-4f9c-990c-775b34d48d78","Type":"ContainerStarted","Data":"76a5b00f7bb75ff78fffe507b21a27fa3e091d4306dcaf62ccb294ffea6cf2a2"} Dec 05 06:12:46 crc kubenswrapper[4865]: I1205 06:12:46.713052 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-95f789cff-nbpm9" event={"ID":"527588f6-952d-4f9c-990c-775b34d48d78","Type":"ContainerStarted","Data":"13a36b784fbc3258f7b79e52f64527eafa50344d7a1b65dcfe4093229d01fa8d"} Dec 05 06:12:46 crc kubenswrapper[4865]: I1205 06:12:46.760501 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6564bc679b-dbsbx" podStartSLOduration=3.9165317440000003 podStartE2EDuration="9.760479321s" podCreationTimestamp="2025-12-05 06:12:37 +0000 UTC" firstStartedPulling="2025-12-05 06:12:39.813047933 +0000 UTC m=+1179.093059155" lastFinishedPulling="2025-12-05 06:12:45.65699551 +0000 UTC m=+1184.937006732" observedRunningTime="2025-12-05 06:12:46.747150201 +0000 UTC m=+1186.027161423" 
watchObservedRunningTime="2025-12-05 06:12:46.760479321 +0000 UTC m=+1186.040490543" Dec 05 06:12:46 crc kubenswrapper[4865]: I1205 06:12:46.780794 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-95f789cff-nbpm9" podStartSLOduration=4.06851827 podStartE2EDuration="9.780770208s" podCreationTimestamp="2025-12-05 06:12:37 +0000 UTC" firstStartedPulling="2025-12-05 06:12:39.976881026 +0000 UTC m=+1179.256892248" lastFinishedPulling="2025-12-05 06:12:45.689132964 +0000 UTC m=+1184.969144186" observedRunningTime="2025-12-05 06:12:46.771091993 +0000 UTC m=+1186.051103215" watchObservedRunningTime="2025-12-05 06:12:46.780770208 +0000 UTC m=+1186.060781440" Dec 05 06:12:46 crc kubenswrapper[4865]: I1205 06:12:46.814238 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-c67fc55d6-grhds" podStartSLOduration=5.81421417 podStartE2EDuration="5.81421417s" podCreationTimestamp="2025-12-05 06:12:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:12:46.803352311 +0000 UTC m=+1186.083363533" watchObservedRunningTime="2025-12-05 06:12:46.81421417 +0000 UTC m=+1186.094225382" Dec 05 06:12:47 crc kubenswrapper[4865]: I1205 06:12:47.663000 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:47 crc kubenswrapper[4865]: I1205 06:12:47.663006 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-595958cf6d-v6bd6" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": dial tcp 10.217.0.160:9311: connect: connection refused" Dec 05 06:12:47 crc kubenswrapper[4865]: I1205 06:12:47.734969 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-595958cf6d-v6bd6" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": dial tcp 10.217.0.160:9311: connect: connection refused" Dec 05 06:12:47 crc kubenswrapper[4865]: I1205 06:12:47.737552 4865 scope.go:117] "RemoveContainer" containerID="65b241205b5ca975f2be3cab1a17f9491c61d898e83899f547e5c1acada1c790" Dec 05 06:12:47 crc kubenswrapper[4865]: E1205 06:12:47.738208 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=barbican-api pod=barbican-api-595958cf6d-v6bd6_openstack(b1eecac4-4210-4a9d-9d8a-bcf21327c712)\"" pod="openstack/barbican-api-595958cf6d-v6bd6" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" Dec 05 06:12:48 crc kubenswrapper[4865]: I1205 06:12:48.654450 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:12:48 crc kubenswrapper[4865]: I1205 06:12:48.663351 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:48 crc kubenswrapper[4865]: I1205 06:12:48.663862 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-595958cf6d-v6bd6" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": dial tcp 10.217.0.160:9311: connect: 
connection refused" Dec 05 06:12:48 crc kubenswrapper[4865]: I1205 06:12:48.747443 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-mfzbq"] Dec 05 06:12:48 crc kubenswrapper[4865]: I1205 06:12:48.747688 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" podUID="62b75a9f-3535-47bf-8874-6ef496fc894d" containerName="dnsmasq-dns" containerID="cri-o://052f17d7dd4d5eed035c98d02e3d8a4d7ff7ec6304471171514c3b6d3e20fbab" gracePeriod=10 Dec 05 06:12:48 crc kubenswrapper[4865]: I1205 06:12:48.753924 4865 scope.go:117] "RemoveContainer" containerID="65b241205b5ca975f2be3cab1a17f9491c61d898e83899f547e5c1acada1c790" Dec 05 06:12:48 crc kubenswrapper[4865]: E1205 06:12:48.754167 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=barbican-api pod=barbican-api-595958cf6d-v6bd6_openstack(b1eecac4-4210-4a9d-9d8a-bcf21327c712)\"" pod="openstack/barbican-api-595958cf6d-v6bd6" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" Dec 05 06:12:48 crc kubenswrapper[4865]: I1205 06:12:48.757247 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-595958cf6d-v6bd6" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": dial tcp 10.217.0.160:9311: connect: connection refused" Dec 05 06:12:49 crc kubenswrapper[4865]: I1205 06:12:49.765554 4865 generic.go:334] "Generic (PLEG): container finished" podID="62b75a9f-3535-47bf-8874-6ef496fc894d" containerID="052f17d7dd4d5eed035c98d02e3d8a4d7ff7ec6304471171514c3b6d3e20fbab" exitCode=0 Dec 05 06:12:49 crc kubenswrapper[4865]: I1205 06:12:49.765920 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" event={"ID":"62b75a9f-3535-47bf-8874-6ef496fc894d","Type":"ContainerDied","Data":"052f17d7dd4d5eed035c98d02e3d8a4d7ff7ec6304471171514c3b6d3e20fbab"} Dec 05 06:12:49 crc kubenswrapper[4865]: I1205 06:12:49.766618 4865 scope.go:117] "RemoveContainer" containerID="65b241205b5ca975f2be3cab1a17f9491c61d898e83899f547e5c1acada1c790" Dec 05 06:12:49 crc kubenswrapper[4865]: I1205 06:12:49.766683 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-595958cf6d-v6bd6" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": dial tcp 10.217.0.160:9311: connect: connection refused" Dec 05 06:12:49 crc kubenswrapper[4865]: E1205 06:12:49.766881 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=barbican-api pod=barbican-api-595958cf6d-v6bd6_openstack(b1eecac4-4210-4a9d-9d8a-bcf21327c712)\"" pod="openstack/barbican-api-595958cf6d-v6bd6" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" Dec 05 06:12:50 crc kubenswrapper[4865]: I1205 06:12:50.663701 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-595958cf6d-v6bd6" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": dial tcp 10.217.0.160:9311: connect: connection refused" Dec 05 06:12:51 crc kubenswrapper[4865]: I1205 06:12:51.156959 4865 prober.go:107] "Probe 
failed" probeType="Startup" pod="openstack/horizon-bd68dd9b8-z62zt" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 05 06:12:51 crc kubenswrapper[4865]: I1205 06:12:51.300332 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-78c59b79fd-5jlv4" podUID="0b2dbfc6-6978-4613-a307-d4d4b4b88bc9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 05 06:12:51 crc kubenswrapper[4865]: E1205 06:12:51.929802 4865 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5e4dce7_c9e7_4813_a957_1df502644792.slice/crio-a9145a9a4c3cb9abfd5ddb8a4ccea3b5c568087506ed84014d0eb1eb2681eb5c.scope\": RecentStats: unable to find data in memory cache]" Dec 05 06:12:52 crc kubenswrapper[4865]: I1205 06:12:52.511339 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" podUID="62b75a9f-3535-47bf-8874-6ef496fc894d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.152:5353: connect: connection refused" Dec 05 06:12:52 crc kubenswrapper[4865]: I1205 06:12:52.838671 4865 generic.go:334] "Generic (PLEG): container finished" podID="b5e4dce7-c9e7-4813-a957-1df502644792" containerID="a9145a9a4c3cb9abfd5ddb8a4ccea3b5c568087506ed84014d0eb1eb2681eb5c" exitCode=0 Dec 05 06:12:52 crc kubenswrapper[4865]: I1205 06:12:52.838747 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f7d5b" event={"ID":"b5e4dce7-c9e7-4813-a957-1df502644792","Type":"ContainerDied","Data":"a9145a9a4c3cb9abfd5ddb8a4ccea3b5c568087506ed84014d0eb1eb2681eb5c"} Dec 05 06:12:52 crc kubenswrapper[4865]: I1205 06:12:52.886347 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:12:53 crc kubenswrapper[4865]: I1205 06:12:53.663156 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-595958cf6d-v6bd6" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": dial tcp 10.217.0.160:9311: connect: connection refused" Dec 05 06:12:53 crc kubenswrapper[4865]: I1205 06:12:53.663479 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:12:53 crc kubenswrapper[4865]: I1205 06:12:53.663157 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-595958cf6d-v6bd6" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": dial tcp 10.217.0.160:9311: connect: connection refused" Dec 05 06:12:53 crc kubenswrapper[4865]: I1205 06:12:53.664242 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="barbican-api-log" containerStatusID={"Type":"cri-o","ID":"228bf3b63facdaeef9d5e1438f980af4843972de4fed62596836d1ea133b9793"} pod="openstack/barbican-api-595958cf6d-v6bd6" containerMessage="Container barbican-api-log failed liveness probe, will be restarted" Dec 05 06:12:53 crc kubenswrapper[4865]: 
I1205 06:12:53.664266 4865 scope.go:117] "RemoveContainer" containerID="65b241205b5ca975f2be3cab1a17f9491c61d898e83899f547e5c1acada1c790" Dec 05 06:12:53 crc kubenswrapper[4865]: I1205 06:12:53.664290 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-595958cf6d-v6bd6" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api-log" containerID="cri-o://228bf3b63facdaeef9d5e1438f980af4843972de4fed62596836d1ea133b9793" gracePeriod=30 Dec 05 06:12:53 crc kubenswrapper[4865]: I1205 06:12:53.666125 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-595958cf6d-v6bd6" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": dial tcp 10.217.0.160:9311: connect: connection refused" Dec 05 06:12:53 crc kubenswrapper[4865]: I1205 06:12:53.860632 4865 generic.go:334] "Generic (PLEG): container finished" podID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerID="228bf3b63facdaeef9d5e1438f980af4843972de4fed62596836d1ea133b9793" exitCode=143 Dec 05 06:12:53 crc kubenswrapper[4865]: I1205 06:12:53.860713 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-595958cf6d-v6bd6" event={"ID":"b1eecac4-4210-4a9d-9d8a-bcf21327c712","Type":"ContainerDied","Data":"228bf3b63facdaeef9d5e1438f980af4843972de4fed62596836d1ea133b9793"} Dec 05 06:12:54 crc kubenswrapper[4865]: I1205 06:12:54.844790 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:12:54 crc kubenswrapper[4865]: I1205 06:12:54.945346 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-db-sync-config-data\") pod \"b5e4dce7-c9e7-4813-a957-1df502644792\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " Dec 05 06:12:54 crc kubenswrapper[4865]: I1205 06:12:54.945391 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f7d5b" event={"ID":"b5e4dce7-c9e7-4813-a957-1df502644792","Type":"ContainerDied","Data":"bf41e3f47c78f15d8a83a70ed918b306484e815d568941424cec5a9cb9093ae9"} Dec 05 06:12:54 crc kubenswrapper[4865]: I1205 06:12:54.945458 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf41e3f47c78f15d8a83a70ed918b306484e815d568941424cec5a9cb9093ae9" Dec 05 06:12:54 crc kubenswrapper[4865]: I1205 06:12:54.945512 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-config-data\") pod \"b5e4dce7-c9e7-4813-a957-1df502644792\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " Dec 05 06:12:54 crc kubenswrapper[4865]: I1205 06:12:54.945546 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-scripts\") pod \"b5e4dce7-c9e7-4813-a957-1df502644792\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " Dec 05 06:12:54 crc kubenswrapper[4865]: I1205 06:12:54.945567 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-f7d5b" Dec 05 06:12:54 crc kubenswrapper[4865]: I1205 06:12:54.945611 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh5bv\" (UniqueName: \"kubernetes.io/projected/b5e4dce7-c9e7-4813-a957-1df502644792-kube-api-access-kh5bv\") pod \"b5e4dce7-c9e7-4813-a957-1df502644792\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " Dec 05 06:12:54 crc kubenswrapper[4865]: I1205 06:12:54.945674 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-combined-ca-bundle\") pod \"b5e4dce7-c9e7-4813-a957-1df502644792\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " Dec 05 06:12:54 crc kubenswrapper[4865]: I1205 06:12:54.945776 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b5e4dce7-c9e7-4813-a957-1df502644792-etc-machine-id\") pod \"b5e4dce7-c9e7-4813-a957-1df502644792\" (UID: \"b5e4dce7-c9e7-4813-a957-1df502644792\") " Dec 05 06:12:54 crc kubenswrapper[4865]: I1205 06:12:54.947905 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5e4dce7-c9e7-4813-a957-1df502644792-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b5e4dce7-c9e7-4813-a957-1df502644792" (UID: "b5e4dce7-c9e7-4813-a957-1df502644792"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:12:54 crc kubenswrapper[4865]: I1205 06:12:54.957263 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-scripts" (OuterVolumeSpecName: "scripts") pod "b5e4dce7-c9e7-4813-a957-1df502644792" (UID: "b5e4dce7-c9e7-4813-a957-1df502644792"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:12:54 crc kubenswrapper[4865]: I1205 06:12:54.970043 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b5e4dce7-c9e7-4813-a957-1df502644792" (UID: "b5e4dce7-c9e7-4813-a957-1df502644792"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:12:54 crc kubenswrapper[4865]: I1205 06:12:54.972976 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e4dce7-c9e7-4813-a957-1df502644792-kube-api-access-kh5bv" (OuterVolumeSpecName: "kube-api-access-kh5bv") pod "b5e4dce7-c9e7-4813-a957-1df502644792" (UID: "b5e4dce7-c9e7-4813-a957-1df502644792"). InnerVolumeSpecName "kube-api-access-kh5bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.004992 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5e4dce7-c9e7-4813-a957-1df502644792" (UID: "b5e4dce7-c9e7-4813-a957-1df502644792"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.057541 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.057589 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh5bv\" (UniqueName: \"kubernetes.io/projected/b5e4dce7-c9e7-4813-a957-1df502644792-kube-api-access-kh5bv\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.057600 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.057609 4865 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b5e4dce7-c9e7-4813-a957-1df502644792-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.057617 4865 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.071622 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-config-data" (OuterVolumeSpecName: "config-data") pod "b5e4dce7-c9e7-4813-a957-1df502644792" (UID: "b5e4dce7-c9e7-4813-a957-1df502644792"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.159065 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e4dce7-c9e7-4813-a957-1df502644792-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.243257 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-cc4q4"] Dec 05 06:12:55 crc kubenswrapper[4865]: E1205 06:12:55.243692 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e4dce7-c9e7-4813-a957-1df502644792" containerName="cinder-db-sync" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.243709 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e4dce7-c9e7-4813-a957-1df502644792" containerName="cinder-db-sync" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.243909 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e4dce7-c9e7-4813-a957-1df502644792" containerName="cinder-db-sync" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.244976 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.256724 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.258752 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.271040 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-cc4q4"] Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.280763 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-pgfdn" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.280853 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.280898 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.280765 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.304194 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.388399 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-cc4q4\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.388445 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.388499 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-scripts\") pod \"cinder-scheduler-0\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.388543 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45262909-db71-460b-9e2d-2fcc4ae45748-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.388564 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-config\") pod \"dnsmasq-dns-6bb4fc677f-cc4q4\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.388590 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-config-data\") pod \"cinder-scheduler-0\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.388608 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bcm9d\" (UniqueName: \"kubernetes.io/projected/45262909-db71-460b-9e2d-2fcc4ae45748-kube-api-access-bcm9d\") pod \"cinder-scheduler-0\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.388634 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-cc4q4\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.388652 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-cc4q4\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.388794 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.388843 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hftz2\" (UniqueName: \"kubernetes.io/projected/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-kube-api-access-hftz2\") pod \"dnsmasq-dns-6bb4fc677f-cc4q4\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.388888 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-cc4q4\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.490456 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-cc4q4\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.490508 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.491663 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-cc4q4\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.499027 4865 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-scripts\") pod \"cinder-scheduler-0\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.499159 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45262909-db71-460b-9e2d-2fcc4ae45748-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.499208 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-config\") pod \"dnsmasq-dns-6bb4fc677f-cc4q4\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.499259 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-config-data\") pod \"cinder-scheduler-0\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.499298 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcm9d\" (UniqueName: \"kubernetes.io/projected/45262909-db71-460b-9e2d-2fcc4ae45748-kube-api-access-bcm9d\") pod \"cinder-scheduler-0\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.499363 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-cc4q4\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.499387 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-cc4q4\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.499467 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.499511 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hftz2\" (UniqueName: \"kubernetes.io/projected/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-kube-api-access-hftz2\") pod \"dnsmasq-dns-6bb4fc677f-cc4q4\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.499609 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-cc4q4\" (UID: 
\"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.500438 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-cc4q4\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.501680 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45262909-db71-460b-9e2d-2fcc4ae45748-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.502370 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-config\") pod \"dnsmasq-dns-6bb4fc677f-cc4q4\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.502545 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-cc4q4\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.503103 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-cc4q4\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.510590 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-scripts\") pod \"cinder-scheduler-0\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.511269 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.511888 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.512184 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-config-data\") pod \"cinder-scheduler-0\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.513484 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.534613 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.535144 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.537470 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcm9d\" (UniqueName: \"kubernetes.io/projected/45262909-db71-460b-9e2d-2fcc4ae45748-kube-api-access-bcm9d\") pod \"cinder-scheduler-0\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.548846 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hftz2\" (UniqueName: \"kubernetes.io/projected/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-kube-api-access-hftz2\") pod \"dnsmasq-dns-6bb4fc677f-cc4q4\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.553399 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.602265 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-scripts\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.602312 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-config-data\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.602364 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.602431 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/152d76e8-934b-43df-af0a-892c08425ef8-logs\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.602476 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-config-data-custom\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.602505 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/152d76e8-934b-43df-af0a-892c08425ef8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.602588 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk9cc\" (UniqueName: \"kubernetes.io/projected/152d76e8-934b-43df-af0a-892c08425ef8-kube-api-access-zk9cc\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.612346 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.633282 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.704424 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/152d76e8-934b-43df-af0a-892c08425ef8-logs\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.704494 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-config-data-custom\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.704520 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/152d76e8-934b-43df-af0a-892c08425ef8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.704576 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk9cc\" (UniqueName: \"kubernetes.io/projected/152d76e8-934b-43df-af0a-892c08425ef8-kube-api-access-zk9cc\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.704620 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-scripts\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.704635 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-config-data\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.704665 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.705020 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/152d76e8-934b-43df-af0a-892c08425ef8-logs\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.705091 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/152d76e8-934b-43df-af0a-892c08425ef8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.712519 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.712535 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-config-data-custom\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.713792 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-config-data\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.717297 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-scripts\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.732120 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk9cc\" (UniqueName: \"kubernetes.io/projected/152d76e8-934b-43df-af0a-892c08425ef8-kube-api-access-zk9cc\") pod \"cinder-api-0\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.867317 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 06:12:55 crc kubenswrapper[4865]: I1205 06:12:55.927303 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:56 crc kubenswrapper[4865]: I1205 06:12:56.136788 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-67df96fc59-crcwg" Dec 05 06:12:56 crc kubenswrapper[4865]: I1205 06:12:56.215726 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5f79fcff88-45kgr"] Dec 05 06:12:56 crc kubenswrapper[4865]: I1205 06:12:56.215977 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5f79fcff88-45kgr" podUID="01721db4-0a32-46e7-a617-4f7369599b6e" containerName="neutron-api" containerID="cri-o://c89de8fc3ea0e65295dce7889cb2efc05c7c869baa641ccded41055166dae720" gracePeriod=30 Dec 05 06:12:56 crc kubenswrapper[4865]: I1205 06:12:56.217601 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5f79fcff88-45kgr" podUID="01721db4-0a32-46e7-a617-4f7369599b6e" containerName="neutron-httpd" containerID="cri-o://e3628b4b79584bb90b59b4cb9baadea29c22e95a83c5a7989fb8be1a18ede308" gracePeriod=30 Dec 05 06:12:56 crc kubenswrapper[4865]: I1205 06:12:56.488143 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-c67fc55d6-grhds" Dec 05 06:12:56 crc kubenswrapper[4865]: I1205 06:12:56.561430 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-595958cf6d-v6bd6"] Dec 05 06:12:56 crc kubenswrapper[4865]: I1205 06:12:56.965472 4865 generic.go:334] "Generic (PLEG): container finished" podID="01721db4-0a32-46e7-a617-4f7369599b6e" containerID="e3628b4b79584bb90b59b4cb9baadea29c22e95a83c5a7989fb8be1a18ede308" exitCode=0 Dec 05 06:12:56 crc kubenswrapper[4865]: I1205 06:12:56.965759 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f79fcff88-45kgr" event={"ID":"01721db4-0a32-46e7-a617-4f7369599b6e","Type":"ContainerDied","Data":"e3628b4b79584bb90b59b4cb9baadea29c22e95a83c5a7989fb8be1a18ede308"} Dec 05 06:12:57 crc kubenswrapper[4865]: I1205 06:12:57.511451 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" podUID="62b75a9f-3535-47bf-8874-6ef496fc894d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.152:5353: connect: connection refused" Dec 05 06:12:58 crc kubenswrapper[4865]: I1205 06:12:58.360520 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 06:12:58 crc kubenswrapper[4865]: I1205 06:12:58.663322 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-595958cf6d-v6bd6" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": dial tcp 10.217.0.160:9311: connect: connection refused" Dec 05 06:12:58 crc kubenswrapper[4865]: E1205 06:12:58.988460 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Dec 05 06:12:58 crc kubenswrapper[4865]: E1205 06:12:58.989667 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sn2st,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1c79c96f-7385-43dc-8ceb-ac1bf6de7a66): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 06:12:58 crc kubenswrapper[4865]: E1205 06:12:58.990992 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="1c79c96f-7385-43dc-8ceb-ac1bf6de7a66" Dec 05 06:12:59 crc kubenswrapper[4865]: I1205 06:12:59.022452 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1c79c96f-7385-43dc-8ceb-ac1bf6de7a66" containerName="ceilometer-notification-agent" 
containerID="cri-o://260075ad4d4f651121f83eff1321b949610ecae3c506deb3e0c299c03e016b7b" gracePeriod=30 Dec 05 06:12:59 crc kubenswrapper[4865]: I1205 06:12:59.023754 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1c79c96f-7385-43dc-8ceb-ac1bf6de7a66" containerName="sg-core" containerID="cri-o://a1494eed2d1b4355ec4d7a369f30b6a3f7ce01b64f50182ed648191895a4042d" gracePeriod=30 Dec 05 06:12:59 crc kubenswrapper[4865]: I1205 06:12:59.457168 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:12:59 crc kubenswrapper[4865]: I1205 06:12:59.554485 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-c67fc55d6-grhds" podUID="115995c2-39bf-4d60-bcf9-ca342384137a" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.161:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 06:12:59 crc kubenswrapper[4865]: I1205 06:12:59.593494 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkqbh\" (UniqueName: \"kubernetes.io/projected/62b75a9f-3535-47bf-8874-6ef496fc894d-kube-api-access-qkqbh\") pod \"62b75a9f-3535-47bf-8874-6ef496fc894d\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " Dec 05 06:12:59 crc kubenswrapper[4865]: I1205 06:12:59.593603 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-dns-svc\") pod \"62b75a9f-3535-47bf-8874-6ef496fc894d\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " Dec 05 06:12:59 crc kubenswrapper[4865]: I1205 06:12:59.593633 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-ovsdbserver-nb\") pod \"62b75a9f-3535-47bf-8874-6ef496fc894d\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " Dec 05 06:12:59 crc kubenswrapper[4865]: I1205 06:12:59.593705 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-config\") pod \"62b75a9f-3535-47bf-8874-6ef496fc894d\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " Dec 05 06:12:59 crc kubenswrapper[4865]: I1205 06:12:59.593837 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-ovsdbserver-sb\") pod \"62b75a9f-3535-47bf-8874-6ef496fc894d\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " Dec 05 06:12:59 crc kubenswrapper[4865]: I1205 06:12:59.593926 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-dns-swift-storage-0\") pod \"62b75a9f-3535-47bf-8874-6ef496fc894d\" (UID: \"62b75a9f-3535-47bf-8874-6ef496fc894d\") " Dec 05 06:12:59 crc kubenswrapper[4865]: I1205 06:12:59.608306 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62b75a9f-3535-47bf-8874-6ef496fc894d-kube-api-access-qkqbh" (OuterVolumeSpecName: "kube-api-access-qkqbh") pod "62b75a9f-3535-47bf-8874-6ef496fc894d" (UID: "62b75a9f-3535-47bf-8874-6ef496fc894d"). InnerVolumeSpecName "kube-api-access-qkqbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:12:59 crc kubenswrapper[4865]: I1205 06:12:59.703072 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkqbh\" (UniqueName: \"kubernetes.io/projected/62b75a9f-3535-47bf-8874-6ef496fc894d-kube-api-access-qkqbh\") on node \"crc\" DevicePath \"\"" Dec 05 06:12:59 crc kubenswrapper[4865]: I1205 06:12:59.770217 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-cc4q4"] Dec 05 06:12:59 crc kubenswrapper[4865]: I1205 06:12:59.984683 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "62b75a9f-3535-47bf-8874-6ef496fc894d" (UID: "62b75a9f-3535-47bf-8874-6ef496fc894d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:12:59 crc kubenswrapper[4865]: I1205 06:12:59.991300 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "62b75a9f-3535-47bf-8874-6ef496fc894d" (UID: "62b75a9f-3535-47bf-8874-6ef496fc894d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.005307 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.052543 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "62b75a9f-3535-47bf-8874-6ef496fc894d" (UID: "62b75a9f-3535-47bf-8874-6ef496fc894d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.056976 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.058774 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.058871 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.058932 4865 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.140718 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-config" (OuterVolumeSpecName: "config") pod "62b75a9f-3535-47bf-8874-6ef496fc894d" (UID: "62b75a9f-3535-47bf-8874-6ef496fc894d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.147266 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "62b75a9f-3535-47bf-8874-6ef496fc894d" (UID: "62b75a9f-3535-47bf-8874-6ef496fc894d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.164609 4865 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.164647 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b75a9f-3535-47bf-8874-6ef496fc894d-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.166713 4865 generic.go:334] "Generic (PLEG): container finished" podID="1c79c96f-7385-43dc-8ceb-ac1bf6de7a66" containerID="a1494eed2d1b4355ec4d7a369f30b6a3f7ce01b64f50182ed648191895a4042d" exitCode=2 Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.166780 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66","Type":"ContainerDied","Data":"a1494eed2d1b4355ec4d7a369f30b6a3f7ce01b64f50182ed648191895a4042d"} Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.218684 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" event={"ID":"62b75a9f-3535-47bf-8874-6ef496fc894d","Type":"ContainerDied","Data":"3433efb53501752eddcaec701019820a9e1edaf61b56535c60292f7ccf7d22b3"} Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.218726 4865 scope.go:117] "RemoveContainer" containerID="052f17d7dd4d5eed035c98d02e3d8a4d7ff7ec6304471171514c3b6d3e20fbab" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.218846 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-mfzbq" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.282385 4865 generic.go:334] "Generic (PLEG): container finished" podID="01721db4-0a32-46e7-a617-4f7369599b6e" containerID="c89de8fc3ea0e65295dce7889cb2efc05c7c869baa641ccded41055166dae720" exitCode=0 Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.282455 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f79fcff88-45kgr" event={"ID":"01721db4-0a32-46e7-a617-4f7369599b6e","Type":"ContainerDied","Data":"c89de8fc3ea0e65295dce7889cb2efc05c7c869baa641ccded41055166dae720"} Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.288054 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.288946 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" event={"ID":"b934cf19-1e79-4b97-bf09-8af1cb89d6d5","Type":"ContainerStarted","Data":"12b553a7dfb4ab11be5b3ebc373fa25160aec99f93ea307901255ce58c9382fb"} Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.302902 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-mfzbq"] Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.328086 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-mfzbq"] Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.374519 4865 scope.go:117] "RemoveContainer" containerID="0e3e6180f439fcc8142a5ed3a65119081f4d9961b8cb401e68e2512cd729e75e" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.503085 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-combined-ca-bundle\") pod \"01721db4-0a32-46e7-a617-4f7369599b6e\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.503171 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-config\") pod \"01721db4-0a32-46e7-a617-4f7369599b6e\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.503241 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kmg9\" (UniqueName: \"kubernetes.io/projected/01721db4-0a32-46e7-a617-4f7369599b6e-kube-api-access-5kmg9\") pod \"01721db4-0a32-46e7-a617-4f7369599b6e\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.503331 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-ovndb-tls-certs\") pod \"01721db4-0a32-46e7-a617-4f7369599b6e\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.503389 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-httpd-config\") pod \"01721db4-0a32-46e7-a617-4f7369599b6e\" (UID: \"01721db4-0a32-46e7-a617-4f7369599b6e\") " Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.529175 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01721db4-0a32-46e7-a617-4f7369599b6e-kube-api-access-5kmg9" (OuterVolumeSpecName: "kube-api-access-5kmg9") pod "01721db4-0a32-46e7-a617-4f7369599b6e" (UID: "01721db4-0a32-46e7-a617-4f7369599b6e"). InnerVolumeSpecName "kube-api-access-5kmg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.543632 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "01721db4-0a32-46e7-a617-4f7369599b6e" (UID: "01721db4-0a32-46e7-a617-4f7369599b6e"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.606929 4865 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.607057 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kmg9\" (UniqueName: \"kubernetes.io/projected/01721db4-0a32-46e7-a617-4f7369599b6e-kube-api-access-5kmg9\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.755225 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-config" (OuterVolumeSpecName: "config") pod "01721db4-0a32-46e7-a617-4f7369599b6e" (UID: "01721db4-0a32-46e7-a617-4f7369599b6e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.810354 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01721db4-0a32-46e7-a617-4f7369599b6e" (UID: "01721db4-0a32-46e7-a617-4f7369599b6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.813117 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.813295 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.869708 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "01721db4-0a32-46e7-a617-4f7369599b6e" (UID: "01721db4-0a32-46e7-a617-4f7369599b6e"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:00 crc kubenswrapper[4865]: I1205 06:13:00.916809 4865 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/01721db4-0a32-46e7-a617-4f7369599b6e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.034834 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62b75a9f-3535-47bf-8874-6ef496fc894d" path="/var/lib/kubelet/pods/62b75a9f-3535-47bf-8874-6ef496fc894d/volumes" Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.156345 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-bd68dd9b8-z62zt" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.156409 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.157381 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"ae9988b24b0cc529f27a61e58c049c77ec8edcedb21946f5111a11587be650d1"} pod="openstack/horizon-bd68dd9b8-z62zt" containerMessage="Container horizon failed startup probe, will be restarted" Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.157423 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-bd68dd9b8-z62zt" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerName="horizon" containerID="cri-o://ae9988b24b0cc529f27a61e58c049c77ec8edcedb21946f5111a11587be650d1" gracePeriod=30 Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.299440 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-78c59b79fd-5jlv4" podUID="0b2dbfc6-6978-4613-a307-d4d4b4b88bc9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.299528 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.300293 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"2c566b0fa9fad49639e8ef5098b129e44f8c6799cb1513e54a1766170e2190fd"} pod="openstack/horizon-78c59b79fd-5jlv4" containerMessage="Container horizon failed startup probe, will be restarted" Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.300334 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-78c59b79fd-5jlv4" podUID="0b2dbfc6-6978-4613-a307-d4d4b4b88bc9" containerName="horizon" containerID="cri-o://2c566b0fa9fad49639e8ef5098b129e44f8c6799cb1513e54a1766170e2190fd" gracePeriod=30 Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.305259 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f79fcff88-45kgr" event={"ID":"01721db4-0a32-46e7-a617-4f7369599b6e","Type":"ContainerDied","Data":"84ec100765c2edcd94d28c8ca76ea2c68256d8c94ff82afcfd8cf548e4c54e8a"} Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.305512 4865 scope.go:117] 
"RemoveContainer" containerID="e3628b4b79584bb90b59b4cb9baadea29c22e95a83c5a7989fb8be1a18ede308" Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.305766 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5f79fcff88-45kgr" Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.324640 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"152d76e8-934b-43df-af0a-892c08425ef8","Type":"ContainerStarted","Data":"ea9efb8657c27f6542d3d2c5a5c94248dfbc1a311c88a1e47712a3e2f1226a6a"} Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.332521 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" event={"ID":"b934cf19-1e79-4b97-bf09-8af1cb89d6d5","Type":"ContainerStarted","Data":"652c47eb401454a63b9cd27031a6b25376cf01e022683cb7909e7aaeab604c2f"} Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.337980 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-595958cf6d-v6bd6" event={"ID":"b1eecac4-4210-4a9d-9d8a-bcf21327c712","Type":"ContainerStarted","Data":"281b24b09c94f88179f1f7ff9763d9c85af047eb75919a76b16e34bbe0e895cf"} Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.338009 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-595958cf6d-v6bd6" event={"ID":"b1eecac4-4210-4a9d-9d8a-bcf21327c712","Type":"ContainerStarted","Data":"d10f90d968778c7df434e37e20867f68b81482405287adea323d0c28b6095321"} Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.338166 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-595958cf6d-v6bd6" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api-log" containerID="cri-o://281b24b09c94f88179f1f7ff9763d9c85af047eb75919a76b16e34bbe0e895cf" gracePeriod=30 Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.338272 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.338299 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.338627 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-595958cf6d-v6bd6" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api" containerID="cri-o://d10f90d968778c7df434e37e20867f68b81482405287adea323d0c28b6095321" gracePeriod=30 Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.344016 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5f79fcff88-45kgr"] Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.346582 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"45262909-db71-460b-9e2d-2fcc4ae45748","Type":"ContainerStarted","Data":"97c37d0a60f865cc6ceb26f58c5e8688eca75294351ced100815e0ec2d41ac02"} Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.354250 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5f79fcff88-45kgr"] Dec 05 06:13:01 crc kubenswrapper[4865]: I1205 06:13:01.597664 4865 scope.go:117] "RemoveContainer" containerID="c89de8fc3ea0e65295dce7889cb2efc05c7c869baa641ccded41055166dae720" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.033684 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.053531 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-sg-core-conf-yaml\") pod \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.053787 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-scripts\") pod \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.053908 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-combined-ca-bundle\") pod \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.053979 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-log-httpd\") pod \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.054086 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn2st\" (UniqueName: \"kubernetes.io/projected/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-kube-api-access-sn2st\") pod \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.057449 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1c79c96f-7385-43dc-8ceb-ac1bf6de7a66" (UID: "1c79c96f-7385-43dc-8ceb-ac1bf6de7a66"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.057569 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-config-data\") pod \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.057672 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-run-httpd\") pod \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\" (UID: \"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66\") " Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.058698 4865 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.061554 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-kube-api-access-sn2st" (OuterVolumeSpecName: "kube-api-access-sn2st") pod "1c79c96f-7385-43dc-8ceb-ac1bf6de7a66" (UID: "1c79c96f-7385-43dc-8ceb-ac1bf6de7a66"). 
InnerVolumeSpecName "kube-api-access-sn2st". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.061707 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1c79c96f-7385-43dc-8ceb-ac1bf6de7a66" (UID: "1c79c96f-7385-43dc-8ceb-ac1bf6de7a66"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.119812 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c79c96f-7385-43dc-8ceb-ac1bf6de7a66" (UID: "1c79c96f-7385-43dc-8ceb-ac1bf6de7a66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.137555 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-scripts" (OuterVolumeSpecName: "scripts") pod "1c79c96f-7385-43dc-8ceb-ac1bf6de7a66" (UID: "1c79c96f-7385-43dc-8ceb-ac1bf6de7a66"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.137722 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1c79c96f-7385-43dc-8ceb-ac1bf6de7a66" (UID: "1c79c96f-7385-43dc-8ceb-ac1bf6de7a66"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.140283 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-config-data" (OuterVolumeSpecName: "config-data") pod "1c79c96f-7385-43dc-8ceb-ac1bf6de7a66" (UID: "1c79c96f-7385-43dc-8ceb-ac1bf6de7a66"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.171527 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.177228 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.178858 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn2st\" (UniqueName: \"kubernetes.io/projected/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-kube-api-access-sn2st\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.178993 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.179056 4865 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.179130 4865 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.363630 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"152d76e8-934b-43df-af0a-892c08425ef8","Type":"ContainerStarted","Data":"0ae43ed58fb9d3851ae6f94289b5e3e7bae7c09cf943ff34722d23d09804ff50"} Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.368502 4865 generic.go:334] "Generic (PLEG): container finished" podID="b934cf19-1e79-4b97-bf09-8af1cb89d6d5" containerID="652c47eb401454a63b9cd27031a6b25376cf01e022683cb7909e7aaeab604c2f" exitCode=0 Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.368575 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" event={"ID":"b934cf19-1e79-4b97-bf09-8af1cb89d6d5","Type":"ContainerDied","Data":"652c47eb401454a63b9cd27031a6b25376cf01e022683cb7909e7aaeab604c2f"} Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.406543 4865 generic.go:334] "Generic (PLEG): container finished" podID="1c79c96f-7385-43dc-8ceb-ac1bf6de7a66" containerID="260075ad4d4f651121f83eff1321b949610ecae3c506deb3e0c299c03e016b7b" exitCode=0 Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.406601 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66","Type":"ContainerDied","Data":"260075ad4d4f651121f83eff1321b949610ecae3c506deb3e0c299c03e016b7b"} Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.406629 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c79c96f-7385-43dc-8ceb-ac1bf6de7a66","Type":"ContainerDied","Data":"3d6770735365fde5612578a66e65f2704b249d8593b8fc976465edc859faa9b6"} Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.406647 4865 scope.go:117] "RemoveContainer" containerID="a1494eed2d1b4355ec4d7a369f30b6a3f7ce01b64f50182ed648191895a4042d" Dec 05 
06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.406786 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.412291 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.413376 4865 generic.go:334] "Generic (PLEG): container finished" podID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerID="d10f90d968778c7df434e37e20867f68b81482405287adea323d0c28b6095321" exitCode=1 Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.413394 4865 generic.go:334] "Generic (PLEG): container finished" podID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerID="281b24b09c94f88179f1f7ff9763d9c85af047eb75919a76b16e34bbe0e895cf" exitCode=143 Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.413412 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-595958cf6d-v6bd6" event={"ID":"b1eecac4-4210-4a9d-9d8a-bcf21327c712","Type":"ContainerDied","Data":"d10f90d968778c7df434e37e20867f68b81482405287adea323d0c28b6095321"} Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.413430 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-595958cf6d-v6bd6" event={"ID":"b1eecac4-4210-4a9d-9d8a-bcf21327c712","Type":"ContainerDied","Data":"281b24b09c94f88179f1f7ff9763d9c85af047eb75919a76b16e34bbe0e895cf"} Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.474642 4865 scope.go:117] "RemoveContainer" containerID="260075ad4d4f651121f83eff1321b949610ecae3c506deb3e0c299c03e016b7b" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.483343 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1eecac4-4210-4a9d-9d8a-bcf21327c712-config-data\") pod \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.483394 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lmhv\" (UniqueName: \"kubernetes.io/projected/b1eecac4-4210-4a9d-9d8a-bcf21327c712-kube-api-access-9lmhv\") pod \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.483446 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1eecac4-4210-4a9d-9d8a-bcf21327c712-logs\") pod \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.483542 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1eecac4-4210-4a9d-9d8a-bcf21327c712-combined-ca-bundle\") pod \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.483652 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1eecac4-4210-4a9d-9d8a-bcf21327c712-config-data-custom\") pod \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\" (UID: \"b1eecac4-4210-4a9d-9d8a-bcf21327c712\") " Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.496804 4865 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1eecac4-4210-4a9d-9d8a-bcf21327c712-logs" (OuterVolumeSpecName: "logs") pod "b1eecac4-4210-4a9d-9d8a-bcf21327c712" (UID: "b1eecac4-4210-4a9d-9d8a-bcf21327c712"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.497018 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1eecac4-4210-4a9d-9d8a-bcf21327c712-kube-api-access-9lmhv" (OuterVolumeSpecName: "kube-api-access-9lmhv") pod "b1eecac4-4210-4a9d-9d8a-bcf21327c712" (UID: "b1eecac4-4210-4a9d-9d8a-bcf21327c712"). InnerVolumeSpecName "kube-api-access-9lmhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.502436 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1eecac4-4210-4a9d-9d8a-bcf21327c712-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b1eecac4-4210-4a9d-9d8a-bcf21327c712" (UID: "b1eecac4-4210-4a9d-9d8a-bcf21327c712"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.541244 4865 scope.go:117] "RemoveContainer" containerID="a1494eed2d1b4355ec4d7a369f30b6a3f7ce01b64f50182ed648191895a4042d" Dec 05 06:13:02 crc kubenswrapper[4865]: E1205 06:13:02.552730 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1494eed2d1b4355ec4d7a369f30b6a3f7ce01b64f50182ed648191895a4042d\": container with ID starting with a1494eed2d1b4355ec4d7a369f30b6a3f7ce01b64f50182ed648191895a4042d not found: ID does not exist" containerID="a1494eed2d1b4355ec4d7a369f30b6a3f7ce01b64f50182ed648191895a4042d" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.552768 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1494eed2d1b4355ec4d7a369f30b6a3f7ce01b64f50182ed648191895a4042d"} err="failed to get container status \"a1494eed2d1b4355ec4d7a369f30b6a3f7ce01b64f50182ed648191895a4042d\": rpc error: code = NotFound desc = could not find container \"a1494eed2d1b4355ec4d7a369f30b6a3f7ce01b64f50182ed648191895a4042d\": container with ID starting with a1494eed2d1b4355ec4d7a369f30b6a3f7ce01b64f50182ed648191895a4042d not found: ID does not exist" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.552797 4865 scope.go:117] "RemoveContainer" containerID="260075ad4d4f651121f83eff1321b949610ecae3c506deb3e0c299c03e016b7b" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.556870 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:13:02 crc kubenswrapper[4865]: E1205 06:13:02.557690 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"260075ad4d4f651121f83eff1321b949610ecae3c506deb3e0c299c03e016b7b\": container with ID starting with 260075ad4d4f651121f83eff1321b949610ecae3c506deb3e0c299c03e016b7b not found: ID does not exist" containerID="260075ad4d4f651121f83eff1321b949610ecae3c506deb3e0c299c03e016b7b" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.559063 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260075ad4d4f651121f83eff1321b949610ecae3c506deb3e0c299c03e016b7b"} err="failed to get container status 
\"260075ad4d4f651121f83eff1321b949610ecae3c506deb3e0c299c03e016b7b\": rpc error: code = NotFound desc = could not find container \"260075ad4d4f651121f83eff1321b949610ecae3c506deb3e0c299c03e016b7b\": container with ID starting with 260075ad4d4f651121f83eff1321b949610ecae3c506deb3e0c299c03e016b7b not found: ID does not exist" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.560451 4865 scope.go:117] "RemoveContainer" containerID="d10f90d968778c7df434e37e20867f68b81482405287adea323d0c28b6095321" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.566038 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.582236 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:13:02 crc kubenswrapper[4865]: E1205 06:13:02.582671 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api-log" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.582689 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api-log" Dec 05 06:13:02 crc kubenswrapper[4865]: E1205 06:13:02.582710 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.582717 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api" Dec 05 06:13:02 crc kubenswrapper[4865]: E1205 06:13:02.582725 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b75a9f-3535-47bf-8874-6ef496fc894d" containerName="dnsmasq-dns" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.582730 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b75a9f-3535-47bf-8874-6ef496fc894d" containerName="dnsmasq-dns" Dec 05 06:13:02 crc kubenswrapper[4865]: E1205 06:13:02.582749 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c79c96f-7385-43dc-8ceb-ac1bf6de7a66" containerName="sg-core" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.582754 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c79c96f-7385-43dc-8ceb-ac1bf6de7a66" containerName="sg-core" Dec 05 06:13:02 crc kubenswrapper[4865]: E1205 06:13:02.582766 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b75a9f-3535-47bf-8874-6ef496fc894d" containerName="init" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.582772 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b75a9f-3535-47bf-8874-6ef496fc894d" containerName="init" Dec 05 06:13:02 crc kubenswrapper[4865]: E1205 06:13:02.582786 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.582792 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api" Dec 05 06:13:02 crc kubenswrapper[4865]: E1205 06:13:02.582798 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.582804 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api" Dec 05 06:13:02 crc 
kubenswrapper[4865]: E1205 06:13:02.582829 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01721db4-0a32-46e7-a617-4f7369599b6e" containerName="neutron-httpd" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.582835 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="01721db4-0a32-46e7-a617-4f7369599b6e" containerName="neutron-httpd" Dec 05 06:13:02 crc kubenswrapper[4865]: E1205 06:13:02.582843 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c79c96f-7385-43dc-8ceb-ac1bf6de7a66" containerName="ceilometer-notification-agent" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.582849 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c79c96f-7385-43dc-8ceb-ac1bf6de7a66" containerName="ceilometer-notification-agent" Dec 05 06:13:02 crc kubenswrapper[4865]: E1205 06:13:02.582861 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01721db4-0a32-46e7-a617-4f7369599b6e" containerName="neutron-api" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.582867 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="01721db4-0a32-46e7-a617-4f7369599b6e" containerName="neutron-api" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.583030 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api-log" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.583043 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.583051 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.583061 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c79c96f-7385-43dc-8ceb-ac1bf6de7a66" containerName="sg-core" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.583072 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c79c96f-7385-43dc-8ceb-ac1bf6de7a66" containerName="ceilometer-notification-agent" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.583082 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="01721db4-0a32-46e7-a617-4f7369599b6e" containerName="neutron-httpd" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.583091 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="01721db4-0a32-46e7-a617-4f7369599b6e" containerName="neutron-api" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.583102 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="62b75a9f-3535-47bf-8874-6ef496fc894d" containerName="dnsmasq-dns" Dec 05 06:13:02 crc kubenswrapper[4865]: E1205 06:13:02.583271 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api-log" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.583279 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api-log" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.583441 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.583464 4865 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" containerName="barbican-api-log" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.584794 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.587624 4865 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1eecac4-4210-4a9d-9d8a-bcf21327c712-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.587656 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lmhv\" (UniqueName: \"kubernetes.io/projected/b1eecac4-4210-4a9d-9d8a-bcf21327c712-kube-api-access-9lmhv\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.587668 4865 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1eecac4-4210-4a9d-9d8a-bcf21327c712-logs\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.592434 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.592924 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.607125 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.644787 4865 scope.go:117] "RemoveContainer" containerID="281b24b09c94f88179f1f7ff9763d9c85af047eb75919a76b16e34bbe0e895cf" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.655170 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1eecac4-4210-4a9d-9d8a-bcf21327c712-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1eecac4-4210-4a9d-9d8a-bcf21327c712" (UID: "b1eecac4-4210-4a9d-9d8a-bcf21327c712"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.673288 4865 scope.go:117] "RemoveContainer" containerID="65b241205b5ca975f2be3cab1a17f9491c61d898e83899f547e5c1acada1c790" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.689655 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.689712 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-config-data\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.689736 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66jvh\" (UniqueName: \"kubernetes.io/projected/4e1e4be9-e191-4245-b107-81e7ea608c7c-kube-api-access-66jvh\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.690066 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1eecac4-4210-4a9d-9d8a-bcf21327c712-config-data" (OuterVolumeSpecName: "config-data") pod "b1eecac4-4210-4a9d-9d8a-bcf21327c712" (UID: "b1eecac4-4210-4a9d-9d8a-bcf21327c712"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.690102 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e1e4be9-e191-4245-b107-81e7ea608c7c-run-httpd\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.690172 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.690235 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e1e4be9-e191-4245-b107-81e7ea608c7c-log-httpd\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.690323 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-scripts\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.690396 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1eecac4-4210-4a9d-9d8a-bcf21327c712-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:02 crc 
kubenswrapper[4865]: I1205 06:13:02.690410 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1eecac4-4210-4a9d-9d8a-bcf21327c712-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.727065 4865 scope.go:117] "RemoveContainer" containerID="228bf3b63facdaeef9d5e1438f980af4843972de4fed62596836d1ea133b9793" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.754206 4865 scope.go:117] "RemoveContainer" containerID="d10f90d968778c7df434e37e20867f68b81482405287adea323d0c28b6095321" Dec 05 06:13:02 crc kubenswrapper[4865]: E1205 06:13:02.754813 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d10f90d968778c7df434e37e20867f68b81482405287adea323d0c28b6095321\": container with ID starting with d10f90d968778c7df434e37e20867f68b81482405287adea323d0c28b6095321 not found: ID does not exist" containerID="d10f90d968778c7df434e37e20867f68b81482405287adea323d0c28b6095321" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.754854 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d10f90d968778c7df434e37e20867f68b81482405287adea323d0c28b6095321"} err="failed to get container status \"d10f90d968778c7df434e37e20867f68b81482405287adea323d0c28b6095321\": rpc error: code = NotFound desc = could not find container \"d10f90d968778c7df434e37e20867f68b81482405287adea323d0c28b6095321\": container with ID starting with d10f90d968778c7df434e37e20867f68b81482405287adea323d0c28b6095321 not found: ID does not exist" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.754877 4865 scope.go:117] "RemoveContainer" containerID="281b24b09c94f88179f1f7ff9763d9c85af047eb75919a76b16e34bbe0e895cf" Dec 05 06:13:02 crc kubenswrapper[4865]: E1205 06:13:02.755278 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"281b24b09c94f88179f1f7ff9763d9c85af047eb75919a76b16e34bbe0e895cf\": container with ID starting with 281b24b09c94f88179f1f7ff9763d9c85af047eb75919a76b16e34bbe0e895cf not found: ID does not exist" containerID="281b24b09c94f88179f1f7ff9763d9c85af047eb75919a76b16e34bbe0e895cf" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.755304 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"281b24b09c94f88179f1f7ff9763d9c85af047eb75919a76b16e34bbe0e895cf"} err="failed to get container status \"281b24b09c94f88179f1f7ff9763d9c85af047eb75919a76b16e34bbe0e895cf\": rpc error: code = NotFound desc = could not find container \"281b24b09c94f88179f1f7ff9763d9c85af047eb75919a76b16e34bbe0e895cf\": container with ID starting with 281b24b09c94f88179f1f7ff9763d9c85af047eb75919a76b16e34bbe0e895cf not found: ID does not exist" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.755327 4865 scope.go:117] "RemoveContainer" containerID="65b241205b5ca975f2be3cab1a17f9491c61d898e83899f547e5c1acada1c790" Dec 05 06:13:02 crc kubenswrapper[4865]: E1205 06:13:02.755669 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65b241205b5ca975f2be3cab1a17f9491c61d898e83899f547e5c1acada1c790\": container with ID starting with 65b241205b5ca975f2be3cab1a17f9491c61d898e83899f547e5c1acada1c790 not found: ID does not exist" containerID="65b241205b5ca975f2be3cab1a17f9491c61d898e83899f547e5c1acada1c790" Dec 
05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.755707 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65b241205b5ca975f2be3cab1a17f9491c61d898e83899f547e5c1acada1c790"} err="failed to get container status \"65b241205b5ca975f2be3cab1a17f9491c61d898e83899f547e5c1acada1c790\": rpc error: code = NotFound desc = could not find container \"65b241205b5ca975f2be3cab1a17f9491c61d898e83899f547e5c1acada1c790\": container with ID starting with 65b241205b5ca975f2be3cab1a17f9491c61d898e83899f547e5c1acada1c790 not found: ID does not exist" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.755724 4865 scope.go:117] "RemoveContainer" containerID="228bf3b63facdaeef9d5e1438f980af4843972de4fed62596836d1ea133b9793" Dec 05 06:13:02 crc kubenswrapper[4865]: E1205 06:13:02.756020 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"228bf3b63facdaeef9d5e1438f980af4843972de4fed62596836d1ea133b9793\": container with ID starting with 228bf3b63facdaeef9d5e1438f980af4843972de4fed62596836d1ea133b9793 not found: ID does not exist" containerID="228bf3b63facdaeef9d5e1438f980af4843972de4fed62596836d1ea133b9793" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.756046 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"228bf3b63facdaeef9d5e1438f980af4843972de4fed62596836d1ea133b9793"} err="failed to get container status \"228bf3b63facdaeef9d5e1438f980af4843972de4fed62596836d1ea133b9793\": rpc error: code = NotFound desc = could not find container \"228bf3b63facdaeef9d5e1438f980af4843972de4fed62596836d1ea133b9793\": container with ID starting with 228bf3b63facdaeef9d5e1438f980af4843972de4fed62596836d1ea133b9793 not found: ID does not exist" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.792188 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e1e4be9-e191-4245-b107-81e7ea608c7c-run-httpd\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.792435 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.792460 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e1e4be9-e191-4245-b107-81e7ea608c7c-log-httpd\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.792502 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-scripts\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.792548 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " 
pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.792580 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-config-data\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.792597 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66jvh\" (UniqueName: \"kubernetes.io/projected/4e1e4be9-e191-4245-b107-81e7ea608c7c-kube-api-access-66jvh\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.793125 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e1e4be9-e191-4245-b107-81e7ea608c7c-run-httpd\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.794366 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e1e4be9-e191-4245-b107-81e7ea608c7c-log-httpd\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.798642 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.799180 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.801447 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-scripts\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.801873 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-config-data\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.810621 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66jvh\" (UniqueName: \"kubernetes.io/projected/4e1e4be9-e191-4245-b107-81e7ea608c7c-kube-api-access-66jvh\") pod \"ceilometer-0\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " pod="openstack/ceilometer-0" Dec 05 06:13:02 crc kubenswrapper[4865]: I1205 06:13:02.914482 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:13:03 crc kubenswrapper[4865]: I1205 06:13:03.030597 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01721db4-0a32-46e7-a617-4f7369599b6e" path="/var/lib/kubelet/pods/01721db4-0a32-46e7-a617-4f7369599b6e/volumes" Dec 05 06:13:03 crc kubenswrapper[4865]: I1205 06:13:03.031230 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c79c96f-7385-43dc-8ceb-ac1bf6de7a66" path="/var/lib/kubelet/pods/1c79c96f-7385-43dc-8ceb-ac1bf6de7a66/volumes" Dec 05 06:13:03 crc kubenswrapper[4865]: I1205 06:13:03.424118 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"45262909-db71-460b-9e2d-2fcc4ae45748","Type":"ContainerStarted","Data":"c8d6bffd36a1cd4ea8db3d2077e4bf16c21610b6d3550a995d678ca673f48fb4"} Dec 05 06:13:03 crc kubenswrapper[4865]: I1205 06:13:03.426492 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"152d76e8-934b-43df-af0a-892c08425ef8","Type":"ContainerStarted","Data":"ca37781a86890b9523950a753458bfc532354360aca4e299f6fe037a0d8c8743"} Dec 05 06:13:03 crc kubenswrapper[4865]: I1205 06:13:03.426609 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="152d76e8-934b-43df-af0a-892c08425ef8" containerName="cinder-api-log" containerID="cri-o://0ae43ed58fb9d3851ae6f94289b5e3e7bae7c09cf943ff34722d23d09804ff50" gracePeriod=30 Dec 05 06:13:03 crc kubenswrapper[4865]: I1205 06:13:03.426682 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 05 06:13:03 crc kubenswrapper[4865]: I1205 06:13:03.426969 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="152d76e8-934b-43df-af0a-892c08425ef8" containerName="cinder-api" containerID="cri-o://ca37781a86890b9523950a753458bfc532354360aca4e299f6fe037a0d8c8743" gracePeriod=30 Dec 05 06:13:03 crc kubenswrapper[4865]: I1205 06:13:03.431259 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" event={"ID":"b934cf19-1e79-4b97-bf09-8af1cb89d6d5","Type":"ContainerStarted","Data":"733f94fcf8a137a64516bce1be36da4adea848cac537b1c468db642ae012d9f3"} Dec 05 06:13:03 crc kubenswrapper[4865]: I1205 06:13:03.432378 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:13:03 crc kubenswrapper[4865]: I1205 06:13:03.435509 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-595958cf6d-v6bd6" event={"ID":"b1eecac4-4210-4a9d-9d8a-bcf21327c712","Type":"ContainerDied","Data":"ab6870366cfc39dc3a6c278c682e40ce16fb286b8f2fe4e2f84ab8c557ceef9f"} Dec 05 06:13:03 crc kubenswrapper[4865]: I1205 06:13:03.435580 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-595958cf6d-v6bd6" Dec 05 06:13:03 crc kubenswrapper[4865]: I1205 06:13:03.458034 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=8.458012912 podStartE2EDuration="8.458012912s" podCreationTimestamp="2025-12-05 06:12:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:13:03.450049515 +0000 UTC m=+1202.730060737" watchObservedRunningTime="2025-12-05 06:13:03.458012912 +0000 UTC m=+1202.738024134" Dec 05 06:13:03 crc kubenswrapper[4865]: I1205 06:13:03.486893 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" podStartSLOduration=8.486796741 podStartE2EDuration="8.486796741s" podCreationTimestamp="2025-12-05 06:12:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:13:03.471093584 +0000 UTC m=+1202.751104816" watchObservedRunningTime="2025-12-05 06:13:03.486796741 +0000 UTC m=+1202.766807963" Dec 05 06:13:03 crc kubenswrapper[4865]: I1205 06:13:03.501116 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-595958cf6d-v6bd6"] Dec 05 06:13:03 crc kubenswrapper[4865]: I1205 06:13:03.508431 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-595958cf6d-v6bd6"] Dec 05 06:13:03 crc kubenswrapper[4865]: I1205 06:13:03.522561 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:13:03 crc kubenswrapper[4865]: I1205 06:13:03.890707 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.014607 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk9cc\" (UniqueName: \"kubernetes.io/projected/152d76e8-934b-43df-af0a-892c08425ef8-kube-api-access-zk9cc\") pod \"152d76e8-934b-43df-af0a-892c08425ef8\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.015461 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-combined-ca-bundle\") pod \"152d76e8-934b-43df-af0a-892c08425ef8\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.015538 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/152d76e8-934b-43df-af0a-892c08425ef8-etc-machine-id\") pod \"152d76e8-934b-43df-af0a-892c08425ef8\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.015561 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-scripts\") pod \"152d76e8-934b-43df-af0a-892c08425ef8\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.015599 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-config-data-custom\") pod \"152d76e8-934b-43df-af0a-892c08425ef8\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.015616 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/152d76e8-934b-43df-af0a-892c08425ef8-logs\") pod \"152d76e8-934b-43df-af0a-892c08425ef8\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.015646 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-config-data\") pod \"152d76e8-934b-43df-af0a-892c08425ef8\" (UID: \"152d76e8-934b-43df-af0a-892c08425ef8\") " Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.022203 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/152d76e8-934b-43df-af0a-892c08425ef8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "152d76e8-934b-43df-af0a-892c08425ef8" (UID: "152d76e8-934b-43df-af0a-892c08425ef8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.022552 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/152d76e8-934b-43df-af0a-892c08425ef8-kube-api-access-zk9cc" (OuterVolumeSpecName: "kube-api-access-zk9cc") pod "152d76e8-934b-43df-af0a-892c08425ef8" (UID: "152d76e8-934b-43df-af0a-892c08425ef8"). InnerVolumeSpecName "kube-api-access-zk9cc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.022796 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/152d76e8-934b-43df-af0a-892c08425ef8-logs" (OuterVolumeSpecName: "logs") pod "152d76e8-934b-43df-af0a-892c08425ef8" (UID: "152d76e8-934b-43df-af0a-892c08425ef8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.027199 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-scripts" (OuterVolumeSpecName: "scripts") pod "152d76e8-934b-43df-af0a-892c08425ef8" (UID: "152d76e8-934b-43df-af0a-892c08425ef8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.027755 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "152d76e8-934b-43df-af0a-892c08425ef8" (UID: "152d76e8-934b-43df-af0a-892c08425ef8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.076367 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "152d76e8-934b-43df-af0a-892c08425ef8" (UID: "152d76e8-934b-43df-af0a-892c08425ef8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.076995 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-config-data" (OuterVolumeSpecName: "config-data") pod "152d76e8-934b-43df-af0a-892c08425ef8" (UID: "152d76e8-934b-43df-af0a-892c08425ef8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.118012 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk9cc\" (UniqueName: \"kubernetes.io/projected/152d76e8-934b-43df-af0a-892c08425ef8-kube-api-access-zk9cc\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.118047 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.118056 4865 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/152d76e8-934b-43df-af0a-892c08425ef8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.118066 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.118077 4865 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.118085 4865 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/152d76e8-934b-43df-af0a-892c08425ef8-logs\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.118093 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152d76e8-934b-43df-af0a-892c08425ef8-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.443958 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"45262909-db71-460b-9e2d-2fcc4ae45748","Type":"ContainerStarted","Data":"a6b05f7b5b392079e02325575cf4d7a615c0e0ed18646779c0c9faa1e6178acd"} Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.447305 4865 generic.go:334] "Generic (PLEG): container finished" podID="152d76e8-934b-43df-af0a-892c08425ef8" containerID="ca37781a86890b9523950a753458bfc532354360aca4e299f6fe037a0d8c8743" exitCode=0 Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.447342 4865 generic.go:334] "Generic (PLEG): container finished" podID="152d76e8-934b-43df-af0a-892c08425ef8" containerID="0ae43ed58fb9d3851ae6f94289b5e3e7bae7c09cf943ff34722d23d09804ff50" exitCode=143 Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.447378 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"152d76e8-934b-43df-af0a-892c08425ef8","Type":"ContainerDied","Data":"ca37781a86890b9523950a753458bfc532354360aca4e299f6fe037a0d8c8743"} Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.447404 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"152d76e8-934b-43df-af0a-892c08425ef8","Type":"ContainerDied","Data":"0ae43ed58fb9d3851ae6f94289b5e3e7bae7c09cf943ff34722d23d09804ff50"} Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.447415 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"152d76e8-934b-43df-af0a-892c08425ef8","Type":"ContainerDied","Data":"ea9efb8657c27f6542d3d2c5a5c94248dfbc1a311c88a1e47712a3e2f1226a6a"} Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.447434 4865 scope.go:117] "RemoveContainer" containerID="ca37781a86890b9523950a753458bfc532354360aca4e299f6fe037a0d8c8743" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.447551 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.454597 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e1e4be9-e191-4245-b107-81e7ea608c7c","Type":"ContainerStarted","Data":"80baee471c7b57386fc3c1131518875c6cab6dbb55b94445f72ca24cd853d82f"} Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.454638 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e1e4be9-e191-4245-b107-81e7ea608c7c","Type":"ContainerStarted","Data":"851740a7a1de52247f4b0c2945b81c6075a7e147855f123b5bec5720ab704cfb"} Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.475935 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=8.023366129 podStartE2EDuration="9.475911506s" podCreationTimestamp="2025-12-05 06:12:55 +0000 UTC" firstStartedPulling="2025-12-05 06:13:00.192403447 +0000 UTC m=+1199.472414669" lastFinishedPulling="2025-12-05 06:13:01.644948824 +0000 UTC m=+1200.924960046" observedRunningTime="2025-12-05 06:13:04.474302721 +0000 UTC m=+1203.754313943" watchObservedRunningTime="2025-12-05 06:13:04.475911506 +0000 UTC m=+1203.755922728" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.498862 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.510867 4865 scope.go:117] "RemoveContainer" containerID="0ae43ed58fb9d3851ae6f94289b5e3e7bae7c09cf943ff34722d23d09804ff50" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.522295 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.538999 4865 scope.go:117] "RemoveContainer" containerID="ca37781a86890b9523950a753458bfc532354360aca4e299f6fe037a0d8c8743" Dec 05 06:13:04 crc kubenswrapper[4865]: E1205 06:13:04.539471 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca37781a86890b9523950a753458bfc532354360aca4e299f6fe037a0d8c8743\": container with ID starting with ca37781a86890b9523950a753458bfc532354360aca4e299f6fe037a0d8c8743 not found: ID does not exist" containerID="ca37781a86890b9523950a753458bfc532354360aca4e299f6fe037a0d8c8743" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.539497 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca37781a86890b9523950a753458bfc532354360aca4e299f6fe037a0d8c8743"} err="failed to get container status \"ca37781a86890b9523950a753458bfc532354360aca4e299f6fe037a0d8c8743\": rpc error: code = NotFound desc = could not find container \"ca37781a86890b9523950a753458bfc532354360aca4e299f6fe037a0d8c8743\": container with ID starting with ca37781a86890b9523950a753458bfc532354360aca4e299f6fe037a0d8c8743 not found: ID does not exist" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.539520 4865 scope.go:117] "RemoveContainer" 
containerID="0ae43ed58fb9d3851ae6f94289b5e3e7bae7c09cf943ff34722d23d09804ff50" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.539570 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 06:13:04 crc kubenswrapper[4865]: E1205 06:13:04.539989 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="152d76e8-934b-43df-af0a-892c08425ef8" containerName="cinder-api" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.540003 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="152d76e8-934b-43df-af0a-892c08425ef8" containerName="cinder-api" Dec 05 06:13:04 crc kubenswrapper[4865]: E1205 06:13:04.540011 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="152d76e8-934b-43df-af0a-892c08425ef8" containerName="cinder-api-log" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.540017 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="152d76e8-934b-43df-af0a-892c08425ef8" containerName="cinder-api-log" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.540218 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="152d76e8-934b-43df-af0a-892c08425ef8" containerName="cinder-api-log" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.540235 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="152d76e8-934b-43df-af0a-892c08425ef8" containerName="cinder-api" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.541260 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: E1205 06:13:04.541999 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ae43ed58fb9d3851ae6f94289b5e3e7bae7c09cf943ff34722d23d09804ff50\": container with ID starting with 0ae43ed58fb9d3851ae6f94289b5e3e7bae7c09cf943ff34722d23d09804ff50 not found: ID does not exist" containerID="0ae43ed58fb9d3851ae6f94289b5e3e7bae7c09cf943ff34722d23d09804ff50" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.542032 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae43ed58fb9d3851ae6f94289b5e3e7bae7c09cf943ff34722d23d09804ff50"} err="failed to get container status \"0ae43ed58fb9d3851ae6f94289b5e3e7bae7c09cf943ff34722d23d09804ff50\": rpc error: code = NotFound desc = could not find container \"0ae43ed58fb9d3851ae6f94289b5e3e7bae7c09cf943ff34722d23d09804ff50\": container with ID starting with 0ae43ed58fb9d3851ae6f94289b5e3e7bae7c09cf943ff34722d23d09804ff50 not found: ID does not exist" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.542053 4865 scope.go:117] "RemoveContainer" containerID="ca37781a86890b9523950a753458bfc532354360aca4e299f6fe037a0d8c8743" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.545427 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.545756 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.546008 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.546203 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca37781a86890b9523950a753458bfc532354360aca4e299f6fe037a0d8c8743"} 
err="failed to get container status \"ca37781a86890b9523950a753458bfc532354360aca4e299f6fe037a0d8c8743\": rpc error: code = NotFound desc = could not find container \"ca37781a86890b9523950a753458bfc532354360aca4e299f6fe037a0d8c8743\": container with ID starting with ca37781a86890b9523950a753458bfc532354360aca4e299f6fe037a0d8c8743 not found: ID does not exist" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.546260 4865 scope.go:117] "RemoveContainer" containerID="0ae43ed58fb9d3851ae6f94289b5e3e7bae7c09cf943ff34722d23d09804ff50" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.552442 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae43ed58fb9d3851ae6f94289b5e3e7bae7c09cf943ff34722d23d09804ff50"} err="failed to get container status \"0ae43ed58fb9d3851ae6f94289b5e3e7bae7c09cf943ff34722d23d09804ff50\": rpc error: code = NotFound desc = could not find container \"0ae43ed58fb9d3851ae6f94289b5e3e7bae7c09cf943ff34722d23d09804ff50\": container with ID starting with 0ae43ed58fb9d3851ae6f94289b5e3e7bae7c09cf943ff34722d23d09804ff50 not found: ID does not exist" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.562657 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.631046 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d368636-72ce-46db-ab44-91489de4985f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.631367 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d368636-72ce-46db-ab44-91489de4985f-logs\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.631393 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d368636-72ce-46db-ab44-91489de4985f-config-data\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.631412 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d368636-72ce-46db-ab44-91489de4985f-scripts\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.631435 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d368636-72ce-46db-ab44-91489de4985f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.631466 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d368636-72ce-46db-ab44-91489de4985f-config-data-custom\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 
06:13:04.631522 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d368636-72ce-46db-ab44-91489de4985f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.631601 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czxz4\" (UniqueName: \"kubernetes.io/projected/2d368636-72ce-46db-ab44-91489de4985f-kube-api-access-czxz4\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.631660 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d368636-72ce-46db-ab44-91489de4985f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.734209 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d368636-72ce-46db-ab44-91489de4985f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.734301 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czxz4\" (UniqueName: \"kubernetes.io/projected/2d368636-72ce-46db-ab44-91489de4985f-kube-api-access-czxz4\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.734336 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d368636-72ce-46db-ab44-91489de4985f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.734389 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d368636-72ce-46db-ab44-91489de4985f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.734427 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d368636-72ce-46db-ab44-91489de4985f-logs\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.734470 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d368636-72ce-46db-ab44-91489de4985f-config-data\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.734489 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d368636-72ce-46db-ab44-91489de4985f-scripts\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " 
pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.734509 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d368636-72ce-46db-ab44-91489de4985f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.734559 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d368636-72ce-46db-ab44-91489de4985f-config-data-custom\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.735330 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d368636-72ce-46db-ab44-91489de4985f-logs\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.739062 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d368636-72ce-46db-ab44-91489de4985f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.741586 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d368636-72ce-46db-ab44-91489de4985f-config-data-custom\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.748473 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d368636-72ce-46db-ab44-91489de4985f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.751725 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d368636-72ce-46db-ab44-91489de4985f-scripts\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.752586 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d368636-72ce-46db-ab44-91489de4985f-config-data\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.755418 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d368636-72ce-46db-ab44-91489de4985f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.756056 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d368636-72ce-46db-ab44-91489de4985f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: 
I1205 06:13:04.763341 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czxz4\" (UniqueName: \"kubernetes.io/projected/2d368636-72ce-46db-ab44-91489de4985f-kube-api-access-czxz4\") pod \"cinder-api-0\" (UID: \"2d368636-72ce-46db-ab44-91489de4985f\") " pod="openstack/cinder-api-0" Dec 05 06:13:04 crc kubenswrapper[4865]: I1205 06:13:04.860490 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 06:13:05 crc kubenswrapper[4865]: I1205 06:13:05.031761 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="152d76e8-934b-43df-af0a-892c08425ef8" path="/var/lib/kubelet/pods/152d76e8-934b-43df-af0a-892c08425ef8/volumes" Dec 05 06:13:05 crc kubenswrapper[4865]: I1205 06:13:05.032668 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1eecac4-4210-4a9d-9d8a-bcf21327c712" path="/var/lib/kubelet/pods/b1eecac4-4210-4a9d-9d8a-bcf21327c712/volumes" Dec 05 06:13:05 crc kubenswrapper[4865]: I1205 06:13:05.391982 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 06:13:05 crc kubenswrapper[4865]: I1205 06:13:05.502733 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2d368636-72ce-46db-ab44-91489de4985f","Type":"ContainerStarted","Data":"23fc68bfa7da2555ad36ac473af9968262149a7fda47d8ad5a057b899f3db4b2"} Dec 05 06:13:05 crc kubenswrapper[4865]: I1205 06:13:05.519061 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e1e4be9-e191-4245-b107-81e7ea608c7c","Type":"ContainerStarted","Data":"4e51448c5fc4740a17abb522c8ff60db79821597985e68c4c7cd22bf3e1ee371"} Dec 05 06:13:05 crc kubenswrapper[4865]: I1205 06:13:05.634752 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 05 06:13:06 crc kubenswrapper[4865]: I1205 06:13:06.577338 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e1e4be9-e191-4245-b107-81e7ea608c7c","Type":"ContainerStarted","Data":"20385fef8ee369c6d6dcc40eacc4c66a07f7abda432c35cc31257f5ee9e37eee"} Dec 05 06:13:06 crc kubenswrapper[4865]: I1205 06:13:06.599712 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2d368636-72ce-46db-ab44-91489de4985f","Type":"ContainerStarted","Data":"24991ebb20c67bd298967faf614858e210911c5647231520fcc077334dc11b60"} Dec 05 06:13:07 crc kubenswrapper[4865]: I1205 06:13:07.616028 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2d368636-72ce-46db-ab44-91489de4985f","Type":"ContainerStarted","Data":"96698eb4d1737e1c8c0f56c3ddb8092d2cda5f63919c227ec6e725aba254f719"} Dec 05 06:13:07 crc kubenswrapper[4865]: I1205 06:13:07.616382 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 05 06:13:07 crc kubenswrapper[4865]: I1205 06:13:07.620203 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e1e4be9-e191-4245-b107-81e7ea608c7c","Type":"ContainerStarted","Data":"28e55975500289a76a60e88233427279380804f71e42dd5e833d4cd473559599"} Dec 05 06:13:07 crc kubenswrapper[4865]: I1205 06:13:07.620546 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 06:13:07 crc kubenswrapper[4865]: I1205 06:13:07.706759 4865 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/cinder-api-0" podStartSLOduration=3.706730721 podStartE2EDuration="3.706730721s" podCreationTimestamp="2025-12-05 06:13:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:13:07.654725181 +0000 UTC m=+1206.934736443" watchObservedRunningTime="2025-12-05 06:13:07.706730721 +0000 UTC m=+1206.986741933" Dec 05 06:13:07 crc kubenswrapper[4865]: I1205 06:13:07.711721 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.533838455 podStartE2EDuration="5.711710713s" podCreationTimestamp="2025-12-05 06:13:02 +0000 UTC" firstStartedPulling="2025-12-05 06:13:03.528157598 +0000 UTC m=+1202.808168830" lastFinishedPulling="2025-12-05 06:13:06.706029866 +0000 UTC m=+1205.986041088" observedRunningTime="2025-12-05 06:13:07.694512973 +0000 UTC m=+1206.974524205" watchObservedRunningTime="2025-12-05 06:13:07.711710713 +0000 UTC m=+1206.991721935" Dec 05 06:13:09 crc kubenswrapper[4865]: I1205 06:13:09.867374 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:13:09 crc kubenswrapper[4865]: I1205 06:13:09.876867 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-699b5d9784-7n29d" Dec 05 06:13:10 crc kubenswrapper[4865]: I1205 06:13:10.614039 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:13:10 crc kubenswrapper[4865]: I1205 06:13:10.746993 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-xhd48"] Dec 05 06:13:10 crc kubenswrapper[4865]: I1205 06:13:10.747331 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688c87cc99-xhd48" podUID="d8b0dc39-032b-4ed9-ae54-aafa8a0333cd" containerName="dnsmasq-dns" containerID="cri-o://fb59ca69976da3003fbab50ecbd813c98bdbe16fdfb4dfbc8ed99d53dbea72b3" gracePeriod=10 Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.026080 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.035128 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-668bb48dd6-6gzl7" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.086174 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.361582 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.492933 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-dns-svc\") pod \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.493717 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s67cz\" (UniqueName: \"kubernetes.io/projected/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-kube-api-access-s67cz\") pod \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.493860 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-config\") pod \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.493935 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-ovsdbserver-nb\") pod \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.494067 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-dns-swift-storage-0\") pod \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.494108 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-ovsdbserver-sb\") pod \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\" (UID: \"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd\") " Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.500553 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-kube-api-access-s67cz" (OuterVolumeSpecName: "kube-api-access-s67cz") pod "d8b0dc39-032b-4ed9-ae54-aafa8a0333cd" (UID: "d8b0dc39-032b-4ed9-ae54-aafa8a0333cd"). InnerVolumeSpecName "kube-api-access-s67cz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.556430 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d8b0dc39-032b-4ed9-ae54-aafa8a0333cd" (UID: "d8b0dc39-032b-4ed9-ae54-aafa8a0333cd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.561195 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8b0dc39-032b-4ed9-ae54-aafa8a0333cd" (UID: "d8b0dc39-032b-4ed9-ae54-aafa8a0333cd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.563313 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d8b0dc39-032b-4ed9-ae54-aafa8a0333cd" (UID: "d8b0dc39-032b-4ed9-ae54-aafa8a0333cd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.594690 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d8b0dc39-032b-4ed9-ae54-aafa8a0333cd" (UID: "d8b0dc39-032b-4ed9-ae54-aafa8a0333cd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.595950 4865 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.595974 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.595986 4865 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.595995 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s67cz\" (UniqueName: \"kubernetes.io/projected/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-kube-api-access-s67cz\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.596005 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.600037 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-config" (OuterVolumeSpecName: "config") pod "d8b0dc39-032b-4ed9-ae54-aafa8a0333cd" (UID: "d8b0dc39-032b-4ed9-ae54-aafa8a0333cd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.634777 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 05 06:13:11 crc kubenswrapper[4865]: E1205 06:13:11.635184 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b0dc39-032b-4ed9-ae54-aafa8a0333cd" containerName="init" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.635199 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b0dc39-032b-4ed9-ae54-aafa8a0333cd" containerName="init" Dec 05 06:13:11 crc kubenswrapper[4865]: E1205 06:13:11.635215 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b0dc39-032b-4ed9-ae54-aafa8a0333cd" containerName="dnsmasq-dns" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.635222 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b0dc39-032b-4ed9-ae54-aafa8a0333cd" containerName="dnsmasq-dns" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.635406 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8b0dc39-032b-4ed9-ae54-aafa8a0333cd" containerName="dnsmasq-dns" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.636006 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.640587 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.640647 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-wz5fz" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.640761 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.654607 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.681359 4865 generic.go:334] "Generic (PLEG): container finished" podID="d8b0dc39-032b-4ed9-ae54-aafa8a0333cd" containerID="fb59ca69976da3003fbab50ecbd813c98bdbe16fdfb4dfbc8ed99d53dbea72b3" exitCode=0 Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.681748 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="45262909-db71-460b-9e2d-2fcc4ae45748" containerName="cinder-scheduler" containerID="cri-o://c8d6bffd36a1cd4ea8db3d2077e4bf16c21610b6d3550a995d678ca673f48fb4" gracePeriod=30 Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.682314 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-xhd48" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.682510 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-xhd48" event={"ID":"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd","Type":"ContainerDied","Data":"fb59ca69976da3003fbab50ecbd813c98bdbe16fdfb4dfbc8ed99d53dbea72b3"} Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.682568 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-xhd48" event={"ID":"d8b0dc39-032b-4ed9-ae54-aafa8a0333cd","Type":"ContainerDied","Data":"f88e220c982065db6825ad7b54ecd876bfd1f828a376c3119cc75f6ceb0f2b89"} Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.682806 4865 scope.go:117] "RemoveContainer" containerID="fb59ca69976da3003fbab50ecbd813c98bdbe16fdfb4dfbc8ed99d53dbea72b3" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.683484 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="45262909-db71-460b-9e2d-2fcc4ae45748" containerName="probe" containerID="cri-o://a6b05f7b5b392079e02325575cf4d7a615c0e0ed18646779c0c9faa1e6178acd" gracePeriod=30 Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.697966 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4592e250-3d32-4900-9049-f84c905ab474-openstack-config\") pod \"openstackclient\" (UID: \"4592e250-3d32-4900-9049-f84c905ab474\") " pod="openstack/openstackclient" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.698026 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4592e250-3d32-4900-9049-f84c905ab474-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4592e250-3d32-4900-9049-f84c905ab474\") " pod="openstack/openstackclient" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.698079 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8qgb\" (UniqueName: \"kubernetes.io/projected/4592e250-3d32-4900-9049-f84c905ab474-kube-api-access-w8qgb\") pod \"openstackclient\" (UID: \"4592e250-3d32-4900-9049-f84c905ab474\") " pod="openstack/openstackclient" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.698103 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4592e250-3d32-4900-9049-f84c905ab474-openstack-config-secret\") pod \"openstackclient\" (UID: \"4592e250-3d32-4900-9049-f84c905ab474\") " pod="openstack/openstackclient" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.698229 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.717464 4865 scope.go:117] "RemoveContainer" containerID="25cdd4d077f78826ee9b9198eebf8271678450c1765b49820cdca99051a67b7b" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.729018 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-xhd48"] Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.736775 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-xhd48"] Dec 05 
06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.745031 4865 scope.go:117] "RemoveContainer" containerID="fb59ca69976da3003fbab50ecbd813c98bdbe16fdfb4dfbc8ed99d53dbea72b3" Dec 05 06:13:11 crc kubenswrapper[4865]: E1205 06:13:11.746349 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb59ca69976da3003fbab50ecbd813c98bdbe16fdfb4dfbc8ed99d53dbea72b3\": container with ID starting with fb59ca69976da3003fbab50ecbd813c98bdbe16fdfb4dfbc8ed99d53dbea72b3 not found: ID does not exist" containerID="fb59ca69976da3003fbab50ecbd813c98bdbe16fdfb4dfbc8ed99d53dbea72b3" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.746489 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb59ca69976da3003fbab50ecbd813c98bdbe16fdfb4dfbc8ed99d53dbea72b3"} err="failed to get container status \"fb59ca69976da3003fbab50ecbd813c98bdbe16fdfb4dfbc8ed99d53dbea72b3\": rpc error: code = NotFound desc = could not find container \"fb59ca69976da3003fbab50ecbd813c98bdbe16fdfb4dfbc8ed99d53dbea72b3\": container with ID starting with fb59ca69976da3003fbab50ecbd813c98bdbe16fdfb4dfbc8ed99d53dbea72b3 not found: ID does not exist" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.746580 4865 scope.go:117] "RemoveContainer" containerID="25cdd4d077f78826ee9b9198eebf8271678450c1765b49820cdca99051a67b7b" Dec 05 06:13:11 crc kubenswrapper[4865]: E1205 06:13:11.747015 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25cdd4d077f78826ee9b9198eebf8271678450c1765b49820cdca99051a67b7b\": container with ID starting with 25cdd4d077f78826ee9b9198eebf8271678450c1765b49820cdca99051a67b7b not found: ID does not exist" containerID="25cdd4d077f78826ee9b9198eebf8271678450c1765b49820cdca99051a67b7b" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.747056 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25cdd4d077f78826ee9b9198eebf8271678450c1765b49820cdca99051a67b7b"} err="failed to get container status \"25cdd4d077f78826ee9b9198eebf8271678450c1765b49820cdca99051a67b7b\": rpc error: code = NotFound desc = could not find container \"25cdd4d077f78826ee9b9198eebf8271678450c1765b49820cdca99051a67b7b\": container with ID starting with 25cdd4d077f78826ee9b9198eebf8271678450c1765b49820cdca99051a67b7b not found: ID does not exist" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.799598 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4592e250-3d32-4900-9049-f84c905ab474-openstack-config\") pod \"openstackclient\" (UID: \"4592e250-3d32-4900-9049-f84c905ab474\") " pod="openstack/openstackclient" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.799690 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4592e250-3d32-4900-9049-f84c905ab474-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4592e250-3d32-4900-9049-f84c905ab474\") " pod="openstack/openstackclient" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.799748 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8qgb\" (UniqueName: \"kubernetes.io/projected/4592e250-3d32-4900-9049-f84c905ab474-kube-api-access-w8qgb\") pod \"openstackclient\" (UID: \"4592e250-3d32-4900-9049-f84c905ab474\") " 
pod="openstack/openstackclient" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.799768 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4592e250-3d32-4900-9049-f84c905ab474-openstack-config-secret\") pod \"openstackclient\" (UID: \"4592e250-3d32-4900-9049-f84c905ab474\") " pod="openstack/openstackclient" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.800509 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4592e250-3d32-4900-9049-f84c905ab474-openstack-config\") pod \"openstackclient\" (UID: \"4592e250-3d32-4900-9049-f84c905ab474\") " pod="openstack/openstackclient" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.804615 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4592e250-3d32-4900-9049-f84c905ab474-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4592e250-3d32-4900-9049-f84c905ab474\") " pod="openstack/openstackclient" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.806502 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4592e250-3d32-4900-9049-f84c905ab474-openstack-config-secret\") pod \"openstackclient\" (UID: \"4592e250-3d32-4900-9049-f84c905ab474\") " pod="openstack/openstackclient" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.814875 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8qgb\" (UniqueName: \"kubernetes.io/projected/4592e250-3d32-4900-9049-f84c905ab474-kube-api-access-w8qgb\") pod \"openstackclient\" (UID: \"4592e250-3d32-4900-9049-f84c905ab474\") " pod="openstack/openstackclient" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.963507 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.964291 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 06:13:11 crc kubenswrapper[4865]: I1205 06:13:11.983335 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.006654 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.008695 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.029715 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.107945 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz8kl\" (UniqueName: \"kubernetes.io/projected/98a93aae-b37f-4577-9567-e527f3cab3c7-kube-api-access-xz8kl\") pod \"openstackclient\" (UID: \"98a93aae-b37f-4577-9567-e527f3cab3c7\") " pod="openstack/openstackclient" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.108001 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/98a93aae-b37f-4577-9567-e527f3cab3c7-openstack-config\") pod \"openstackclient\" (UID: \"98a93aae-b37f-4577-9567-e527f3cab3c7\") " pod="openstack/openstackclient" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.108059 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/98a93aae-b37f-4577-9567-e527f3cab3c7-openstack-config-secret\") pod \"openstackclient\" (UID: \"98a93aae-b37f-4577-9567-e527f3cab3c7\") " pod="openstack/openstackclient" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.108093 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98a93aae-b37f-4577-9567-e527f3cab3c7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"98a93aae-b37f-4577-9567-e527f3cab3c7\") " pod="openstack/openstackclient" Dec 05 06:13:12 crc kubenswrapper[4865]: E1205 06:13:12.149217 4865 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 05 06:13:12 crc kubenswrapper[4865]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_4592e250-3d32-4900-9049-f84c905ab474_0(d8217f2545a180e1c607856b8e43478fd368e3c5c77354c0348dede73fae47ff): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d8217f2545a180e1c607856b8e43478fd368e3c5c77354c0348dede73fae47ff" Netns:"/var/run/netns/0cc6748c-a0b9-4755-bb6f-8bba5d07bde7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=d8217f2545a180e1c607856b8e43478fd368e3c5c77354c0348dede73fae47ff;K8S_POD_UID=4592e250-3d32-4900-9049-f84c905ab474" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/4592e250-3d32-4900-9049-f84c905ab474]: expected pod UID "4592e250-3d32-4900-9049-f84c905ab474" but got "98a93aae-b37f-4577-9567-e527f3cab3c7" from Kube API Dec 05 06:13:12 crc kubenswrapper[4865]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 05 06:13:12 crc kubenswrapper[4865]: > Dec 05 06:13:12 crc kubenswrapper[4865]: E1205 06:13:12.149478 4865 kuberuntime_sandbox.go:72] 
"Failed to create sandbox for pod" err=< Dec 05 06:13:12 crc kubenswrapper[4865]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_4592e250-3d32-4900-9049-f84c905ab474_0(d8217f2545a180e1c607856b8e43478fd368e3c5c77354c0348dede73fae47ff): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d8217f2545a180e1c607856b8e43478fd368e3c5c77354c0348dede73fae47ff" Netns:"/var/run/netns/0cc6748c-a0b9-4755-bb6f-8bba5d07bde7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=d8217f2545a180e1c607856b8e43478fd368e3c5c77354c0348dede73fae47ff;K8S_POD_UID=4592e250-3d32-4900-9049-f84c905ab474" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/4592e250-3d32-4900-9049-f84c905ab474]: expected pod UID "4592e250-3d32-4900-9049-f84c905ab474" but got "98a93aae-b37f-4577-9567-e527f3cab3c7" from Kube API Dec 05 06:13:12 crc kubenswrapper[4865]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 05 06:13:12 crc kubenswrapper[4865]: > pod="openstack/openstackclient" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.210200 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/98a93aae-b37f-4577-9567-e527f3cab3c7-openstack-config-secret\") pod \"openstackclient\" (UID: \"98a93aae-b37f-4577-9567-e527f3cab3c7\") " pod="openstack/openstackclient" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.210343 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98a93aae-b37f-4577-9567-e527f3cab3c7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"98a93aae-b37f-4577-9567-e527f3cab3c7\") " pod="openstack/openstackclient" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.211307 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz8kl\" (UniqueName: \"kubernetes.io/projected/98a93aae-b37f-4577-9567-e527f3cab3c7-kube-api-access-xz8kl\") pod \"openstackclient\" (UID: \"98a93aae-b37f-4577-9567-e527f3cab3c7\") " pod="openstack/openstackclient" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.211391 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/98a93aae-b37f-4577-9567-e527f3cab3c7-openstack-config\") pod \"openstackclient\" (UID: \"98a93aae-b37f-4577-9567-e527f3cab3c7\") " pod="openstack/openstackclient" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.215546 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/98a93aae-b37f-4577-9567-e527f3cab3c7-openstack-config-secret\") pod \"openstackclient\" (UID: \"98a93aae-b37f-4577-9567-e527f3cab3c7\") " pod="openstack/openstackclient" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.217183 4865 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/98a93aae-b37f-4577-9567-e527f3cab3c7-openstack-config\") pod \"openstackclient\" (UID: \"98a93aae-b37f-4577-9567-e527f3cab3c7\") " pod="openstack/openstackclient" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.217524 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98a93aae-b37f-4577-9567-e527f3cab3c7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"98a93aae-b37f-4577-9567-e527f3cab3c7\") " pod="openstack/openstackclient" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.226778 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz8kl\" (UniqueName: \"kubernetes.io/projected/98a93aae-b37f-4577-9567-e527f3cab3c7-kube-api-access-xz8kl\") pod \"openstackclient\" (UID: \"98a93aae-b37f-4577-9567-e527f3cab3c7\") " pod="openstack/openstackclient" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.482214 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.727884 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.733099 4865 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="4592e250-3d32-4900-9049-f84c905ab474" podUID="98a93aae-b37f-4577-9567-e527f3cab3c7" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.754682 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.822787 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4592e250-3d32-4900-9049-f84c905ab474-openstack-config-secret\") pod \"4592e250-3d32-4900-9049-f84c905ab474\" (UID: \"4592e250-3d32-4900-9049-f84c905ab474\") " Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.822893 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8qgb\" (UniqueName: \"kubernetes.io/projected/4592e250-3d32-4900-9049-f84c905ab474-kube-api-access-w8qgb\") pod \"4592e250-3d32-4900-9049-f84c905ab474\" (UID: \"4592e250-3d32-4900-9049-f84c905ab474\") " Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.822941 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4592e250-3d32-4900-9049-f84c905ab474-combined-ca-bundle\") pod \"4592e250-3d32-4900-9049-f84c905ab474\" (UID: \"4592e250-3d32-4900-9049-f84c905ab474\") " Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.823014 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4592e250-3d32-4900-9049-f84c905ab474-openstack-config\") pod \"4592e250-3d32-4900-9049-f84c905ab474\" (UID: \"4592e250-3d32-4900-9049-f84c905ab474\") " Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.823897 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4592e250-3d32-4900-9049-f84c905ab474-openstack-config" (OuterVolumeSpecName: "openstack-config") pod 
"4592e250-3d32-4900-9049-f84c905ab474" (UID: "4592e250-3d32-4900-9049-f84c905ab474"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.828430 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4592e250-3d32-4900-9049-f84c905ab474-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4592e250-3d32-4900-9049-f84c905ab474" (UID: "4592e250-3d32-4900-9049-f84c905ab474"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.828607 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4592e250-3d32-4900-9049-f84c905ab474-kube-api-access-w8qgb" (OuterVolumeSpecName: "kube-api-access-w8qgb") pod "4592e250-3d32-4900-9049-f84c905ab474" (UID: "4592e250-3d32-4900-9049-f84c905ab474"). InnerVolumeSpecName "kube-api-access-w8qgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.840124 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4592e250-3d32-4900-9049-f84c905ab474-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4592e250-3d32-4900-9049-f84c905ab474" (UID: "4592e250-3d32-4900-9049-f84c905ab474"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.925159 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8qgb\" (UniqueName: \"kubernetes.io/projected/4592e250-3d32-4900-9049-f84c905ab474-kube-api-access-w8qgb\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.925195 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4592e250-3d32-4900-9049-f84c905ab474-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.925204 4865 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4592e250-3d32-4900-9049-f84c905ab474-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:12 crc kubenswrapper[4865]: I1205 06:13:12.925213 4865 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4592e250-3d32-4900-9049-f84c905ab474-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:13 crc kubenswrapper[4865]: I1205 06:13:13.023426 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4592e250-3d32-4900-9049-f84c905ab474" path="/var/lib/kubelet/pods/4592e250-3d32-4900-9049-f84c905ab474/volumes" Dec 05 06:13:13 crc kubenswrapper[4865]: I1205 06:13:13.023801 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8b0dc39-032b-4ed9-ae54-aafa8a0333cd" path="/var/lib/kubelet/pods/d8b0dc39-032b-4ed9-ae54-aafa8a0333cd/volumes" Dec 05 06:13:13 crc kubenswrapper[4865]: I1205 06:13:13.027243 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 06:13:13 crc kubenswrapper[4865]: I1205 06:13:13.738669 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"98a93aae-b37f-4577-9567-e527f3cab3c7","Type":"ContainerStarted","Data":"3d719d29492cf1e333032e54f81823f072d96ab7b2af26362e5282aa90090941"} Dec 05 06:13:13 crc kubenswrapper[4865]: I1205 06:13:13.746040 4865 generic.go:334] "Generic (PLEG): container finished" podID="45262909-db71-460b-9e2d-2fcc4ae45748" containerID="a6b05f7b5b392079e02325575cf4d7a615c0e0ed18646779c0c9faa1e6178acd" exitCode=0 Dec 05 06:13:13 crc kubenswrapper[4865]: I1205 06:13:13.746141 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 06:13:13 crc kubenswrapper[4865]: I1205 06:13:13.746133 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"45262909-db71-460b-9e2d-2fcc4ae45748","Type":"ContainerDied","Data":"a6b05f7b5b392079e02325575cf4d7a615c0e0ed18646779c0c9faa1e6178acd"} Dec 05 06:13:13 crc kubenswrapper[4865]: I1205 06:13:13.753709 4865 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="4592e250-3d32-4900-9049-f84c905ab474" podUID="98a93aae-b37f-4577-9567-e527f3cab3c7" Dec 05 06:13:14 crc kubenswrapper[4865]: I1205 06:13:14.795876 4865 generic.go:334] "Generic (PLEG): container finished" podID="45262909-db71-460b-9e2d-2fcc4ae45748" containerID="c8d6bffd36a1cd4ea8db3d2077e4bf16c21610b6d3550a995d678ca673f48fb4" exitCode=0 Dec 05 06:13:14 crc kubenswrapper[4865]: I1205 06:13:14.796171 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"45262909-db71-460b-9e2d-2fcc4ae45748","Type":"ContainerDied","Data":"c8d6bffd36a1cd4ea8db3d2077e4bf16c21610b6d3550a995d678ca673f48fb4"} Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.026459 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.087250 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcm9d\" (UniqueName: \"kubernetes.io/projected/45262909-db71-460b-9e2d-2fcc4ae45748-kube-api-access-bcm9d\") pod \"45262909-db71-460b-9e2d-2fcc4ae45748\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.087337 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-combined-ca-bundle\") pod \"45262909-db71-460b-9e2d-2fcc4ae45748\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.087400 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-config-data\") pod \"45262909-db71-460b-9e2d-2fcc4ae45748\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.087460 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-scripts\") pod \"45262909-db71-460b-9e2d-2fcc4ae45748\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.087504 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45262909-db71-460b-9e2d-2fcc4ae45748-etc-machine-id\") pod \"45262909-db71-460b-9e2d-2fcc4ae45748\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.087576 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-config-data-custom\") pod \"45262909-db71-460b-9e2d-2fcc4ae45748\" (UID: \"45262909-db71-460b-9e2d-2fcc4ae45748\") " Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.088522 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45262909-db71-460b-9e2d-2fcc4ae45748-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "45262909-db71-460b-9e2d-2fcc4ae45748" (UID: "45262909-db71-460b-9e2d-2fcc4ae45748"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.125674 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-scripts" (OuterVolumeSpecName: "scripts") pod "45262909-db71-460b-9e2d-2fcc4ae45748" (UID: "45262909-db71-460b-9e2d-2fcc4ae45748"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.137516 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45262909-db71-460b-9e2d-2fcc4ae45748-kube-api-access-bcm9d" (OuterVolumeSpecName: "kube-api-access-bcm9d") pod "45262909-db71-460b-9e2d-2fcc4ae45748" (UID: "45262909-db71-460b-9e2d-2fcc4ae45748"). InnerVolumeSpecName "kube-api-access-bcm9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.182037 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "45262909-db71-460b-9e2d-2fcc4ae45748" (UID: "45262909-db71-460b-9e2d-2fcc4ae45748"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.189989 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcm9d\" (UniqueName: \"kubernetes.io/projected/45262909-db71-460b-9e2d-2fcc4ae45748-kube-api-access-bcm9d\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.190014 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.190025 4865 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45262909-db71-460b-9e2d-2fcc4ae45748-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.190033 4865 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.263854 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45262909-db71-460b-9e2d-2fcc4ae45748" (UID: "45262909-db71-460b-9e2d-2fcc4ae45748"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.292292 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.300173 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-config-data" (OuterVolumeSpecName: "config-data") pod "45262909-db71-460b-9e2d-2fcc4ae45748" (UID: "45262909-db71-460b-9e2d-2fcc4ae45748"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.394475 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45262909-db71-460b-9e2d-2fcc4ae45748-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.809470 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"45262909-db71-460b-9e2d-2fcc4ae45748","Type":"ContainerDied","Data":"97c37d0a60f865cc6ceb26f58c5e8688eca75294351ced100815e0ec2d41ac02"} Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.809548 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.809797 4865 scope.go:117] "RemoveContainer" containerID="a6b05f7b5b392079e02325575cf4d7a615c0e0ed18646779c0c9faa1e6178acd" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.838368 4865 scope.go:117] "RemoveContainer" containerID="c8d6bffd36a1cd4ea8db3d2077e4bf16c21610b6d3550a995d678ca673f48fb4" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.855105 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.868370 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.894367 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 06:13:15 crc kubenswrapper[4865]: E1205 06:13:15.894930 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45262909-db71-460b-9e2d-2fcc4ae45748" containerName="probe" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.894951 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="45262909-db71-460b-9e2d-2fcc4ae45748" containerName="probe" Dec 05 06:13:15 crc kubenswrapper[4865]: E1205 06:13:15.894978 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45262909-db71-460b-9e2d-2fcc4ae45748" containerName="cinder-scheduler" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.894985 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="45262909-db71-460b-9e2d-2fcc4ae45748" containerName="cinder-scheduler" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.895212 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="45262909-db71-460b-9e2d-2fcc4ae45748" containerName="probe" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.895245 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="45262909-db71-460b-9e2d-2fcc4ae45748" containerName="cinder-scheduler" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.896443 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.902407 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 06:13:15 crc kubenswrapper[4865]: I1205 06:13:15.907138 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 06:13:16 crc kubenswrapper[4865]: I1205 06:13:16.005404 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f55qq\" (UniqueName: \"kubernetes.io/projected/3d9b36dc-b4e2-4a85-ab48-63bf2318e717-kube-api-access-f55qq\") pod \"cinder-scheduler-0\" (UID: \"3d9b36dc-b4e2-4a85-ab48-63bf2318e717\") " pod="openstack/cinder-scheduler-0" Dec 05 06:13:16 crc kubenswrapper[4865]: I1205 06:13:16.005461 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d9b36dc-b4e2-4a85-ab48-63bf2318e717-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3d9b36dc-b4e2-4a85-ab48-63bf2318e717\") " pod="openstack/cinder-scheduler-0" Dec 05 06:13:16 crc kubenswrapper[4865]: I1205 06:13:16.005503 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d9b36dc-b4e2-4a85-ab48-63bf2318e717-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3d9b36dc-b4e2-4a85-ab48-63bf2318e717\") " pod="openstack/cinder-scheduler-0" Dec 05 06:13:16 crc kubenswrapper[4865]: I1205 06:13:16.005532 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d9b36dc-b4e2-4a85-ab48-63bf2318e717-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3d9b36dc-b4e2-4a85-ab48-63bf2318e717\") " pod="openstack/cinder-scheduler-0" Dec 05 06:13:16 crc kubenswrapper[4865]: I1205 06:13:16.005594 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d9b36dc-b4e2-4a85-ab48-63bf2318e717-config-data\") pod \"cinder-scheduler-0\" (UID: \"3d9b36dc-b4e2-4a85-ab48-63bf2318e717\") " pod="openstack/cinder-scheduler-0" Dec 05 06:13:16 crc kubenswrapper[4865]: I1205 06:13:16.005631 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d9b36dc-b4e2-4a85-ab48-63bf2318e717-scripts\") pod \"cinder-scheduler-0\" (UID: \"3d9b36dc-b4e2-4a85-ab48-63bf2318e717\") " pod="openstack/cinder-scheduler-0" Dec 05 06:13:16 crc kubenswrapper[4865]: I1205 06:13:16.107784 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d9b36dc-b4e2-4a85-ab48-63bf2318e717-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3d9b36dc-b4e2-4a85-ab48-63bf2318e717\") " pod="openstack/cinder-scheduler-0" Dec 05 06:13:16 crc kubenswrapper[4865]: I1205 06:13:16.107894 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d9b36dc-b4e2-4a85-ab48-63bf2318e717-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3d9b36dc-b4e2-4a85-ab48-63bf2318e717\") " pod="openstack/cinder-scheduler-0" Dec 05 06:13:16 crc kubenswrapper[4865]: I1205 06:13:16.108966 4865 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d9b36dc-b4e2-4a85-ab48-63bf2318e717-config-data\") pod \"cinder-scheduler-0\" (UID: \"3d9b36dc-b4e2-4a85-ab48-63bf2318e717\") " pod="openstack/cinder-scheduler-0" Dec 05 06:13:16 crc kubenswrapper[4865]: I1205 06:13:16.109058 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d9b36dc-b4e2-4a85-ab48-63bf2318e717-scripts\") pod \"cinder-scheduler-0\" (UID: \"3d9b36dc-b4e2-4a85-ab48-63bf2318e717\") " pod="openstack/cinder-scheduler-0" Dec 05 06:13:16 crc kubenswrapper[4865]: I1205 06:13:16.109129 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f55qq\" (UniqueName: \"kubernetes.io/projected/3d9b36dc-b4e2-4a85-ab48-63bf2318e717-kube-api-access-f55qq\") pod \"cinder-scheduler-0\" (UID: \"3d9b36dc-b4e2-4a85-ab48-63bf2318e717\") " pod="openstack/cinder-scheduler-0" Dec 05 06:13:16 crc kubenswrapper[4865]: I1205 06:13:16.109170 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d9b36dc-b4e2-4a85-ab48-63bf2318e717-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3d9b36dc-b4e2-4a85-ab48-63bf2318e717\") " pod="openstack/cinder-scheduler-0" Dec 05 06:13:16 crc kubenswrapper[4865]: I1205 06:13:16.109269 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d9b36dc-b4e2-4a85-ab48-63bf2318e717-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3d9b36dc-b4e2-4a85-ab48-63bf2318e717\") " pod="openstack/cinder-scheduler-0" Dec 05 06:13:16 crc kubenswrapper[4865]: I1205 06:13:16.116159 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d9b36dc-b4e2-4a85-ab48-63bf2318e717-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3d9b36dc-b4e2-4a85-ab48-63bf2318e717\") " pod="openstack/cinder-scheduler-0" Dec 05 06:13:16 crc kubenswrapper[4865]: I1205 06:13:16.119332 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d9b36dc-b4e2-4a85-ab48-63bf2318e717-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3d9b36dc-b4e2-4a85-ab48-63bf2318e717\") " pod="openstack/cinder-scheduler-0" Dec 05 06:13:16 crc kubenswrapper[4865]: I1205 06:13:16.120624 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d9b36dc-b4e2-4a85-ab48-63bf2318e717-config-data\") pod \"cinder-scheduler-0\" (UID: \"3d9b36dc-b4e2-4a85-ab48-63bf2318e717\") " pod="openstack/cinder-scheduler-0" Dec 05 06:13:16 crc kubenswrapper[4865]: I1205 06:13:16.131735 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f55qq\" (UniqueName: \"kubernetes.io/projected/3d9b36dc-b4e2-4a85-ab48-63bf2318e717-kube-api-access-f55qq\") pod \"cinder-scheduler-0\" (UID: \"3d9b36dc-b4e2-4a85-ab48-63bf2318e717\") " pod="openstack/cinder-scheduler-0" Dec 05 06:13:16 crc kubenswrapper[4865]: I1205 06:13:16.148344 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d9b36dc-b4e2-4a85-ab48-63bf2318e717-scripts\") pod \"cinder-scheduler-0\" (UID: \"3d9b36dc-b4e2-4a85-ab48-63bf2318e717\") " pod="openstack/cinder-scheduler-0" Dec 05 06:13:16 crc 
kubenswrapper[4865]: I1205 06:13:16.261869 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 06:13:16 crc kubenswrapper[4865]: I1205 06:13:16.833673 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 06:13:17 crc kubenswrapper[4865]: I1205 06:13:17.035782 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45262909-db71-460b-9e2d-2fcc4ae45748" path="/var/lib/kubelet/pods/45262909-db71-460b-9e2d-2fcc4ae45748/volumes" Dec 05 06:13:17 crc kubenswrapper[4865]: I1205 06:13:17.855775 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3d9b36dc-b4e2-4a85-ab48-63bf2318e717","Type":"ContainerStarted","Data":"4583800d21aad2f4bfc8ece87f3e648ac006aca1c49f3f1985bcb2cc623e7b2d"} Dec 05 06:13:17 crc kubenswrapper[4865]: I1205 06:13:17.856355 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3d9b36dc-b4e2-4a85-ab48-63bf2318e717","Type":"ContainerStarted","Data":"04b3d487229e23a741939de071178c7f14612a134b6dd15d511164b15f762993"} Dec 05 06:13:17 crc kubenswrapper[4865]: I1205 06:13:17.877550 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.213764 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-d56774dc9-sps89"] Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.216295 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.218545 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.226156 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.226155 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.233481 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d56774dc9-sps89"] Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.298796 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ae1380d-b481-4842-a4e5-6e96ad87b998-log-httpd\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.298968 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ae1380d-b481-4842-a4e5-6e96ad87b998-run-httpd\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.299004 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae1380d-b481-4842-a4e5-6e96ad87b998-combined-ca-bundle\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 
06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.299069 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77jtv\" (UniqueName: \"kubernetes.io/projected/5ae1380d-b481-4842-a4e5-6e96ad87b998-kube-api-access-77jtv\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.299118 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ae1380d-b481-4842-a4e5-6e96ad87b998-config-data\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.299137 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5ae1380d-b481-4842-a4e5-6e96ad87b998-etc-swift\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.299158 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ae1380d-b481-4842-a4e5-6e96ad87b998-internal-tls-certs\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.299182 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ae1380d-b481-4842-a4e5-6e96ad87b998-public-tls-certs\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.388715 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.389088 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerName="ceilometer-central-agent" containerID="cri-o://80baee471c7b57386fc3c1131518875c6cab6dbb55b94445f72ca24cd853d82f" gracePeriod=30 Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.389492 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerName="proxy-httpd" containerID="cri-o://28e55975500289a76a60e88233427279380804f71e42dd5e833d4cd473559599" gracePeriod=30 Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.389581 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerName="sg-core" containerID="cri-o://20385fef8ee369c6d6dcc40eacc4c66a07f7abda432c35cc31257f5ee9e37eee" gracePeriod=30 Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.389612 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerName="ceilometer-notification-agent" 
containerID="cri-o://4e51448c5fc4740a17abb522c8ff60db79821597985e68c4c7cd22bf3e1ee371" gracePeriod=30 Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.400970 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ae1380d-b481-4842-a4e5-6e96ad87b998-run-httpd\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.401052 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae1380d-b481-4842-a4e5-6e96ad87b998-combined-ca-bundle\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.401095 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77jtv\" (UniqueName: \"kubernetes.io/projected/5ae1380d-b481-4842-a4e5-6e96ad87b998-kube-api-access-77jtv\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.401146 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ae1380d-b481-4842-a4e5-6e96ad87b998-config-data\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.401166 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5ae1380d-b481-4842-a4e5-6e96ad87b998-etc-swift\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.401197 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ae1380d-b481-4842-a4e5-6e96ad87b998-internal-tls-certs\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.401230 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ae1380d-b481-4842-a4e5-6e96ad87b998-public-tls-certs\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.401249 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.165:3000/\": EOF" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.401327 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ae1380d-b481-4842-a4e5-6e96ad87b998-log-httpd\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.402012 4865 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ae1380d-b481-4842-a4e5-6e96ad87b998-log-httpd\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.402057 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ae1380d-b481-4842-a4e5-6e96ad87b998-run-httpd\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.408226 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5ae1380d-b481-4842-a4e5-6e96ad87b998-etc-swift\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.420387 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ae1380d-b481-4842-a4e5-6e96ad87b998-config-data\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.422258 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ae1380d-b481-4842-a4e5-6e96ad87b998-public-tls-certs\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.432486 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ae1380d-b481-4842-a4e5-6e96ad87b998-internal-tls-certs\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.436218 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77jtv\" (UniqueName: \"kubernetes.io/projected/5ae1380d-b481-4842-a4e5-6e96ad87b998-kube-api-access-77jtv\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.446772 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae1380d-b481-4842-a4e5-6e96ad87b998-combined-ca-bundle\") pod \"swift-proxy-d56774dc9-sps89\" (UID: \"5ae1380d-b481-4842-a4e5-6e96ad87b998\") " pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.559070 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.904135 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3d9b36dc-b4e2-4a85-ab48-63bf2318e717","Type":"ContainerStarted","Data":"d2074ecde8b38cc0429cfbc6a1650fa12b92f08edb920f6f6d80f034fcf852ac"} Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.918111 4865 generic.go:334] "Generic (PLEG): container finished" podID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerID="28e55975500289a76a60e88233427279380804f71e42dd5e833d4cd473559599" exitCode=0 Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.918146 4865 generic.go:334] "Generic (PLEG): container finished" podID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerID="20385fef8ee369c6d6dcc40eacc4c66a07f7abda432c35cc31257f5ee9e37eee" exitCode=2 Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.918168 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e1e4be9-e191-4245-b107-81e7ea608c7c","Type":"ContainerDied","Data":"28e55975500289a76a60e88233427279380804f71e42dd5e833d4cd473559599"} Dec 05 06:13:18 crc kubenswrapper[4865]: I1205 06:13:18.918194 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e1e4be9-e191-4245-b107-81e7ea608c7c","Type":"ContainerDied","Data":"20385fef8ee369c6d6dcc40eacc4c66a07f7abda432c35cc31257f5ee9e37eee"} Dec 05 06:13:19 crc kubenswrapper[4865]: I1205 06:13:19.234725 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.234700983 podStartE2EDuration="4.234700983s" podCreationTimestamp="2025-12-05 06:13:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:13:18.928921879 +0000 UTC m=+1218.208933101" watchObservedRunningTime="2025-12-05 06:13:19.234700983 +0000 UTC m=+1218.514712205" Dec 05 06:13:19 crc kubenswrapper[4865]: I1205 06:13:19.236317 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d56774dc9-sps89"] Dec 05 06:13:20 crc kubenswrapper[4865]: I1205 06:13:20.510564 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d56774dc9-sps89" event={"ID":"5ae1380d-b481-4842-a4e5-6e96ad87b998","Type":"ContainerStarted","Data":"49ed760db83848cfd125f87d92b7d518387060ca494114e736c816984f2c9466"} Dec 05 06:13:20 crc kubenswrapper[4865]: I1205 06:13:20.517475 4865 generic.go:334] "Generic (PLEG): container finished" podID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerID="80baee471c7b57386fc3c1131518875c6cab6dbb55b94445f72ca24cd853d82f" exitCode=0 Dec 05 06:13:20 crc kubenswrapper[4865]: I1205 06:13:20.518748 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e1e4be9-e191-4245-b107-81e7ea608c7c","Type":"ContainerDied","Data":"80baee471c7b57386fc3c1131518875c6cab6dbb55b94445f72ca24cd853d82f"} Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.262930 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.467533 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-r4lqw"] Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.477516 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-r4lqw" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.485653 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-r4lqw"] Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.553949 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d56774dc9-sps89" event={"ID":"5ae1380d-b481-4842-a4e5-6e96ad87b998","Type":"ContainerStarted","Data":"0b2d87c3b18b77923d38567bf768334ad317b29016c4bc6221847f4ba0782c8c"} Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.571180 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-6lclk"] Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.573237 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6lclk" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.599100 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6lclk"] Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.623554 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9zds\" (UniqueName: \"kubernetes.io/projected/ce59a384-76dd-444f-8e42-e4eb194e48e9-kube-api-access-v9zds\") pod \"nova-api-db-create-r4lqw\" (UID: \"ce59a384-76dd-444f-8e42-e4eb194e48e9\") " pod="openstack/nova-api-db-create-r4lqw" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.623651 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce59a384-76dd-444f-8e42-e4eb194e48e9-operator-scripts\") pod \"nova-api-db-create-r4lqw\" (UID: \"ce59a384-76dd-444f-8e42-e4eb194e48e9\") " pod="openstack/nova-api-db-create-r4lqw" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.722079 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-f091-account-create-update-c4wdz"] Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.723370 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f091-account-create-update-c4wdz" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.724800 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9zds\" (UniqueName: \"kubernetes.io/projected/ce59a384-76dd-444f-8e42-e4eb194e48e9-kube-api-access-v9zds\") pod \"nova-api-db-create-r4lqw\" (UID: \"ce59a384-76dd-444f-8e42-e4eb194e48e9\") " pod="openstack/nova-api-db-create-r4lqw" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.724864 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mll55\" (UniqueName: \"kubernetes.io/projected/a8240c19-58ea-4d42-ac99-121c7f01e2f2-kube-api-access-mll55\") pod \"nova-cell0-db-create-6lclk\" (UID: \"a8240c19-58ea-4d42-ac99-121c7f01e2f2\") " pod="openstack/nova-cell0-db-create-6lclk" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.724917 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce59a384-76dd-444f-8e42-e4eb194e48e9-operator-scripts\") pod \"nova-api-db-create-r4lqw\" (UID: \"ce59a384-76dd-444f-8e42-e4eb194e48e9\") " pod="openstack/nova-api-db-create-r4lqw" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.725000 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8240c19-58ea-4d42-ac99-121c7f01e2f2-operator-scripts\") pod \"nova-cell0-db-create-6lclk\" (UID: \"a8240c19-58ea-4d42-ac99-121c7f01e2f2\") " pod="openstack/nova-cell0-db-create-6lclk" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.726559 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce59a384-76dd-444f-8e42-e4eb194e48e9-operator-scripts\") pod \"nova-api-db-create-r4lqw\" (UID: \"ce59a384-76dd-444f-8e42-e4eb194e48e9\") " pod="openstack/nova-api-db-create-r4lqw" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.728301 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.756224 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f091-account-create-update-c4wdz"] Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.774048 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-8qkdx"] Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.775257 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8qkdx" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.784729 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9zds\" (UniqueName: \"kubernetes.io/projected/ce59a384-76dd-444f-8e42-e4eb194e48e9-kube-api-access-v9zds\") pod \"nova-api-db-create-r4lqw\" (UID: \"ce59a384-76dd-444f-8e42-e4eb194e48e9\") " pod="openstack/nova-api-db-create-r4lqw" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.803234 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8qkdx"] Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.811417 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-r4lqw" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.826743 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7733d2ef-e9ed-4977-b294-3941be6b9455-operator-scripts\") pod \"nova-api-f091-account-create-update-c4wdz\" (UID: \"7733d2ef-e9ed-4977-b294-3941be6b9455\") " pod="openstack/nova-api-f091-account-create-update-c4wdz" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.826835 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsxwv\" (UniqueName: \"kubernetes.io/projected/7733d2ef-e9ed-4977-b294-3941be6b9455-kube-api-access-nsxwv\") pod \"nova-api-f091-account-create-update-c4wdz\" (UID: \"7733d2ef-e9ed-4977-b294-3941be6b9455\") " pod="openstack/nova-api-f091-account-create-update-c4wdz" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.826887 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8240c19-58ea-4d42-ac99-121c7f01e2f2-operator-scripts\") pod \"nova-cell0-db-create-6lclk\" (UID: \"a8240c19-58ea-4d42-ac99-121c7f01e2f2\") " pod="openstack/nova-cell0-db-create-6lclk" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.826964 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mll55\" (UniqueName: \"kubernetes.io/projected/a8240c19-58ea-4d42-ac99-121c7f01e2f2-kube-api-access-mll55\") pod \"nova-cell0-db-create-6lclk\" (UID: \"a8240c19-58ea-4d42-ac99-121c7f01e2f2\") " pod="openstack/nova-cell0-db-create-6lclk" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.828128 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8240c19-58ea-4d42-ac99-121c7f01e2f2-operator-scripts\") pod \"nova-cell0-db-create-6lclk\" (UID: \"a8240c19-58ea-4d42-ac99-121c7f01e2f2\") " pod="openstack/nova-cell0-db-create-6lclk" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.863397 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4d95-account-create-update-xsbvp"] Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.864491 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mll55\" (UniqueName: \"kubernetes.io/projected/a8240c19-58ea-4d42-ac99-121c7f01e2f2-kube-api-access-mll55\") pod \"nova-cell0-db-create-6lclk\" (UID: \"a8240c19-58ea-4d42-ac99-121c7f01e2f2\") " pod="openstack/nova-cell0-db-create-6lclk" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.864723 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4d95-account-create-update-xsbvp" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.867968 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.875873 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4d95-account-create-update-xsbvp"] Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.907987 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-6lclk" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.929449 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/519c901d-31cc-4600-96bb-66ecd95aba90-operator-scripts\") pod \"nova-cell1-db-create-8qkdx\" (UID: \"519c901d-31cc-4600-96bb-66ecd95aba90\") " pod="openstack/nova-cell1-db-create-8qkdx" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.929584 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7733d2ef-e9ed-4977-b294-3941be6b9455-operator-scripts\") pod \"nova-api-f091-account-create-update-c4wdz\" (UID: \"7733d2ef-e9ed-4977-b294-3941be6b9455\") " pod="openstack/nova-api-f091-account-create-update-c4wdz" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.929610 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrp6k\" (UniqueName: \"kubernetes.io/projected/519c901d-31cc-4600-96bb-66ecd95aba90-kube-api-access-xrp6k\") pod \"nova-cell1-db-create-8qkdx\" (UID: \"519c901d-31cc-4600-96bb-66ecd95aba90\") " pod="openstack/nova-cell1-db-create-8qkdx" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.929648 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsxwv\" (UniqueName: \"kubernetes.io/projected/7733d2ef-e9ed-4977-b294-3941be6b9455-kube-api-access-nsxwv\") pod \"nova-api-f091-account-create-update-c4wdz\" (UID: \"7733d2ef-e9ed-4977-b294-3941be6b9455\") " pod="openstack/nova-api-f091-account-create-update-c4wdz" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.930501 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7733d2ef-e9ed-4977-b294-3941be6b9455-operator-scripts\") pod \"nova-api-f091-account-create-update-c4wdz\" (UID: \"7733d2ef-e9ed-4977-b294-3941be6b9455\") " pod="openstack/nova-api-f091-account-create-update-c4wdz" Dec 05 06:13:21 crc kubenswrapper[4865]: I1205 06:13:21.949095 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsxwv\" (UniqueName: \"kubernetes.io/projected/7733d2ef-e9ed-4977-b294-3941be6b9455-kube-api-access-nsxwv\") pod \"nova-api-f091-account-create-update-c4wdz\" (UID: \"7733d2ef-e9ed-4977-b294-3941be6b9455\") " pod="openstack/nova-api-f091-account-create-update-c4wdz" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.031358 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74b7332b-f0c3-4e97-9149-0b09e4f74727-operator-scripts\") pod \"nova-cell0-4d95-account-create-update-xsbvp\" (UID: \"74b7332b-f0c3-4e97-9149-0b09e4f74727\") " pod="openstack/nova-cell0-4d95-account-create-update-xsbvp" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.031410 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/519c901d-31cc-4600-96bb-66ecd95aba90-operator-scripts\") pod \"nova-cell1-db-create-8qkdx\" (UID: \"519c901d-31cc-4600-96bb-66ecd95aba90\") " pod="openstack/nova-cell1-db-create-8qkdx" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.031449 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-ngglw\" (UniqueName: \"kubernetes.io/projected/74b7332b-f0c3-4e97-9149-0b09e4f74727-kube-api-access-ngglw\") pod \"nova-cell0-4d95-account-create-update-xsbvp\" (UID: \"74b7332b-f0c3-4e97-9149-0b09e4f74727\") " pod="openstack/nova-cell0-4d95-account-create-update-xsbvp" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.031556 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrp6k\" (UniqueName: \"kubernetes.io/projected/519c901d-31cc-4600-96bb-66ecd95aba90-kube-api-access-xrp6k\") pod \"nova-cell1-db-create-8qkdx\" (UID: \"519c901d-31cc-4600-96bb-66ecd95aba90\") " pod="openstack/nova-cell1-db-create-8qkdx" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.032461 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/519c901d-31cc-4600-96bb-66ecd95aba90-operator-scripts\") pod \"nova-cell1-db-create-8qkdx\" (UID: \"519c901d-31cc-4600-96bb-66ecd95aba90\") " pod="openstack/nova-cell1-db-create-8qkdx" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.051321 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f091-account-create-update-c4wdz" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.053436 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-eee8-account-create-update-2jnjx"] Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.054597 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-eee8-account-create-update-2jnjx" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.057412 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrp6k\" (UniqueName: \"kubernetes.io/projected/519c901d-31cc-4600-96bb-66ecd95aba90-kube-api-access-xrp6k\") pod \"nova-cell1-db-create-8qkdx\" (UID: \"519c901d-31cc-4600-96bb-66ecd95aba90\") " pod="openstack/nova-cell1-db-create-8qkdx" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.058056 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.083515 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-eee8-account-create-update-2jnjx"] Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.132156 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-8qkdx" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.142038 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74b7332b-f0c3-4e97-9149-0b09e4f74727-operator-scripts\") pod \"nova-cell0-4d95-account-create-update-xsbvp\" (UID: \"74b7332b-f0c3-4e97-9149-0b09e4f74727\") " pod="openstack/nova-cell0-4d95-account-create-update-xsbvp" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.142139 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngglw\" (UniqueName: \"kubernetes.io/projected/74b7332b-f0c3-4e97-9149-0b09e4f74727-kube-api-access-ngglw\") pod \"nova-cell0-4d95-account-create-update-xsbvp\" (UID: \"74b7332b-f0c3-4e97-9149-0b09e4f74727\") " pod="openstack/nova-cell0-4d95-account-create-update-xsbvp" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.144118 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74b7332b-f0c3-4e97-9149-0b09e4f74727-operator-scripts\") pod \"nova-cell0-4d95-account-create-update-xsbvp\" (UID: \"74b7332b-f0c3-4e97-9149-0b09e4f74727\") " pod="openstack/nova-cell0-4d95-account-create-update-xsbvp" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.171783 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngglw\" (UniqueName: \"kubernetes.io/projected/74b7332b-f0c3-4e97-9149-0b09e4f74727-kube-api-access-ngglw\") pod \"nova-cell0-4d95-account-create-update-xsbvp\" (UID: \"74b7332b-f0c3-4e97-9149-0b09e4f74727\") " pod="openstack/nova-cell0-4d95-account-create-update-xsbvp" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.226718 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4d95-account-create-update-xsbvp" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.244671 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cec8b30-efea-4cd5-be6c-889b5dc59008-operator-scripts\") pod \"nova-cell1-eee8-account-create-update-2jnjx\" (UID: \"2cec8b30-efea-4cd5-be6c-889b5dc59008\") " pod="openstack/nova-cell1-eee8-account-create-update-2jnjx" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.244798 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdpjp\" (UniqueName: \"kubernetes.io/projected/2cec8b30-efea-4cd5-be6c-889b5dc59008-kube-api-access-rdpjp\") pod \"nova-cell1-eee8-account-create-update-2jnjx\" (UID: \"2cec8b30-efea-4cd5-be6c-889b5dc59008\") " pod="openstack/nova-cell1-eee8-account-create-update-2jnjx" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.346312 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cec8b30-efea-4cd5-be6c-889b5dc59008-operator-scripts\") pod \"nova-cell1-eee8-account-create-update-2jnjx\" (UID: \"2cec8b30-efea-4cd5-be6c-889b5dc59008\") " pod="openstack/nova-cell1-eee8-account-create-update-2jnjx" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.346435 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdpjp\" (UniqueName: \"kubernetes.io/projected/2cec8b30-efea-4cd5-be6c-889b5dc59008-kube-api-access-rdpjp\") pod \"nova-cell1-eee8-account-create-update-2jnjx\" (UID: \"2cec8b30-efea-4cd5-be6c-889b5dc59008\") " pod="openstack/nova-cell1-eee8-account-create-update-2jnjx" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.347627 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cec8b30-efea-4cd5-be6c-889b5dc59008-operator-scripts\") pod \"nova-cell1-eee8-account-create-update-2jnjx\" (UID: \"2cec8b30-efea-4cd5-be6c-889b5dc59008\") " pod="openstack/nova-cell1-eee8-account-create-update-2jnjx" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.364167 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdpjp\" (UniqueName: \"kubernetes.io/projected/2cec8b30-efea-4cd5-be6c-889b5dc59008-kube-api-access-rdpjp\") pod \"nova-cell1-eee8-account-create-update-2jnjx\" (UID: \"2cec8b30-efea-4cd5-be6c-889b5dc59008\") " pod="openstack/nova-cell1-eee8-account-create-update-2jnjx" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.423668 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-eee8-account-create-update-2jnjx" Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.566848 4865 generic.go:334] "Generic (PLEG): container finished" podID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerID="4e51448c5fc4740a17abb522c8ff60db79821597985e68c4c7cd22bf3e1ee371" exitCode=0 Dec 05 06:13:22 crc kubenswrapper[4865]: I1205 06:13:22.566899 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e1e4be9-e191-4245-b107-81e7ea608c7c","Type":"ContainerDied","Data":"4e51448c5fc4740a17abb522c8ff60db79821597985e68c4c7cd22bf3e1ee371"} Dec 05 06:13:26 crc kubenswrapper[4865]: I1205 06:13:26.500367 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 05 06:13:28 crc kubenswrapper[4865]: I1205 06:13:28.982210 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.096835 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-sg-core-conf-yaml\") pod \"4e1e4be9-e191-4245-b107-81e7ea608c7c\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.096901 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-combined-ca-bundle\") pod \"4e1e4be9-e191-4245-b107-81e7ea608c7c\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.097065 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-scripts\") pod \"4e1e4be9-e191-4245-b107-81e7ea608c7c\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.097101 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e1e4be9-e191-4245-b107-81e7ea608c7c-log-httpd\") pod \"4e1e4be9-e191-4245-b107-81e7ea608c7c\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.097138 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e1e4be9-e191-4245-b107-81e7ea608c7c-run-httpd\") pod \"4e1e4be9-e191-4245-b107-81e7ea608c7c\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.097211 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66jvh\" (UniqueName: \"kubernetes.io/projected/4e1e4be9-e191-4245-b107-81e7ea608c7c-kube-api-access-66jvh\") pod \"4e1e4be9-e191-4245-b107-81e7ea608c7c\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.097242 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-config-data\") pod \"4e1e4be9-e191-4245-b107-81e7ea608c7c\" (UID: \"4e1e4be9-e191-4245-b107-81e7ea608c7c\") " Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.107146 4865 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e1e4be9-e191-4245-b107-81e7ea608c7c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4e1e4be9-e191-4245-b107-81e7ea608c7c" (UID: "4e1e4be9-e191-4245-b107-81e7ea608c7c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.107697 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e1e4be9-e191-4245-b107-81e7ea608c7c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4e1e4be9-e191-4245-b107-81e7ea608c7c" (UID: "4e1e4be9-e191-4245-b107-81e7ea608c7c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.109943 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-scripts" (OuterVolumeSpecName: "scripts") pod "4e1e4be9-e191-4245-b107-81e7ea608c7c" (UID: "4e1e4be9-e191-4245-b107-81e7ea608c7c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.131348 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e1e4be9-e191-4245-b107-81e7ea608c7c-kube-api-access-66jvh" (OuterVolumeSpecName: "kube-api-access-66jvh") pod "4e1e4be9-e191-4245-b107-81e7ea608c7c" (UID: "4e1e4be9-e191-4245-b107-81e7ea608c7c"). InnerVolumeSpecName "kube-api-access-66jvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.207400 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66jvh\" (UniqueName: \"kubernetes.io/projected/4e1e4be9-e191-4245-b107-81e7ea608c7c-kube-api-access-66jvh\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.207434 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.207443 4865 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e1e4be9-e191-4245-b107-81e7ea608c7c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.207452 4865 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e1e4be9-e191-4245-b107-81e7ea608c7c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.261236 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4e1e4be9-e191-4245-b107-81e7ea608c7c" (UID: "4e1e4be9-e191-4245-b107-81e7ea608c7c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.309468 4865 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.397759 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e1e4be9-e191-4245-b107-81e7ea608c7c" (UID: "4e1e4be9-e191-4245-b107-81e7ea608c7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.411062 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.439139 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-config-data" (OuterVolumeSpecName: "config-data") pod "4e1e4be9-e191-4245-b107-81e7ea608c7c" (UID: "4e1e4be9-e191-4245-b107-81e7ea608c7c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.512769 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1e4be9-e191-4245-b107-81e7ea608c7c-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.513657 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-eee8-account-create-update-2jnjx"] Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.523533 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4d95-account-create-update-xsbvp"] Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.634639 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-r4lqw"] Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.766221 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6lclk"] Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.782021 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8qkdx"] Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.782982 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d56774dc9-sps89" event={"ID":"5ae1380d-b481-4842-a4e5-6e96ad87b998","Type":"ContainerStarted","Data":"dc3e4c16488dfa39846d055f18d0c74ea6170fb2b91bbb1076547c870cb6f080"} Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.783614 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.783994 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.795296 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4d95-account-create-update-xsbvp" 
event={"ID":"74b7332b-f0c3-4e97-9149-0b09e4f74727","Type":"ContainerStarted","Data":"96c95db86e4b6ec4ace4e31bef317942a8ef48cfa15c89bd1e039c242da5f224"} Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.805792 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-r4lqw" event={"ID":"ce59a384-76dd-444f-8e42-e4eb194e48e9","Type":"ContainerStarted","Data":"378d4df83daf1f535eeb0b728a0e01fb36a7c98f7f8e3fe7fd464bd74687a9a2"} Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.806606 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-d56774dc9-sps89" podUID="5ae1380d-b481-4842-a4e5-6e96ad87b998" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.821575 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-eee8-account-create-update-2jnjx" event={"ID":"2cec8b30-efea-4cd5-be6c-889b5dc59008","Type":"ContainerStarted","Data":"dd7e535458ef9869af6073c8dbb9ba45bd94808b5bfc0967e6fc3a92ccace120"} Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.821965 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-d56774dc9-sps89" podStartSLOduration=11.821944737 podStartE2EDuration="11.821944737s" podCreationTimestamp="2025-12-05 06:13:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:13:29.819408805 +0000 UTC m=+1229.099420027" watchObservedRunningTime="2025-12-05 06:13:29.821944737 +0000 UTC m=+1229.101955959" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.827022 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"98a93aae-b37f-4577-9567-e527f3cab3c7","Type":"ContainerStarted","Data":"015aa7e2c8a6aa4179f53fbe7e5e2dd97712abfa1b8f1490c6e99fec06f6cecc"} Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.855152 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.076394063 podStartE2EDuration="18.855121982s" podCreationTimestamp="2025-12-05 06:13:11 +0000 UTC" firstStartedPulling="2025-12-05 06:13:13.023589815 +0000 UTC m=+1212.303601037" lastFinishedPulling="2025-12-05 06:13:28.802317734 +0000 UTC m=+1228.082328956" observedRunningTime="2025-12-05 06:13:29.848835673 +0000 UTC m=+1229.128846905" watchObservedRunningTime="2025-12-05 06:13:29.855121982 +0000 UTC m=+1229.135133204" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.863153 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e1e4be9-e191-4245-b107-81e7ea608c7c","Type":"ContainerDied","Data":"851740a7a1de52247f4b0c2945b81c6075a7e147855f123b5bec5720ab704cfb"} Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.863211 4865 scope.go:117] "RemoveContainer" containerID="28e55975500289a76a60e88233427279380804f71e42dd5e833d4cd473559599" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.863424 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.892182 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f091-account-create-update-c4wdz"] Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.929859 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.956639 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.978891 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:13:29 crc kubenswrapper[4865]: E1205 06:13:29.979345 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerName="ceilometer-central-agent" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.979362 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerName="ceilometer-central-agent" Dec 05 06:13:29 crc kubenswrapper[4865]: E1205 06:13:29.979379 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerName="proxy-httpd" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.979385 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerName="proxy-httpd" Dec 05 06:13:29 crc kubenswrapper[4865]: E1205 06:13:29.979402 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerName="ceilometer-notification-agent" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.979408 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerName="ceilometer-notification-agent" Dec 05 06:13:29 crc kubenswrapper[4865]: E1205 06:13:29.979441 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerName="sg-core" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.979446 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerName="sg-core" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.979622 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerName="ceilometer-notification-agent" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.979644 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerName="ceilometer-central-agent" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.979661 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerName="sg-core" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.979671 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e1e4be9-e191-4245-b107-81e7ea608c7c" containerName="proxy-httpd" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.981358 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.989051 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.994054 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 06:13:29 crc kubenswrapper[4865]: I1205 06:13:29.994097 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.038645 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-scripts\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.038679 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-config-data\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.038739 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f6d527d-88fb-4cce-becf-d744ce4cc27c-run-httpd\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.038765 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.038801 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.038830 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5ttt\" (UniqueName: \"kubernetes.io/projected/0f6d527d-88fb-4cce-becf-d744ce4cc27c-kube-api-access-p5ttt\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.038868 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f6d527d-88fb-4cce-becf-d744ce4cc27c-log-httpd\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.062165 4865 scope.go:117] "RemoveContainer" containerID="20385fef8ee369c6d6dcc40eacc4c66a07f7abda432c35cc31257f5ee9e37eee" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.124053 4865 scope.go:117] "RemoveContainer" containerID="4e51448c5fc4740a17abb522c8ff60db79821597985e68c4c7cd22bf3e1ee371" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 
06:13:30.141080 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-scripts\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.141751 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-config-data\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.141843 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f6d527d-88fb-4cce-becf-d744ce4cc27c-run-httpd\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.141872 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.141909 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.141932 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5ttt\" (UniqueName: \"kubernetes.io/projected/0f6d527d-88fb-4cce-becf-d744ce4cc27c-kube-api-access-p5ttt\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.141974 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f6d527d-88fb-4cce-becf-d744ce4cc27c-log-httpd\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.143313 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f6d527d-88fb-4cce-becf-d744ce4cc27c-log-httpd\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.144295 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f6d527d-88fb-4cce-becf-d744ce4cc27c-run-httpd\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.150003 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-scripts\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.150388 4865 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-config-data\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.167539 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.173148 4865 scope.go:117] "RemoveContainer" containerID="80baee471c7b57386fc3c1131518875c6cab6dbb55b94445f72ca24cd853d82f" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.181668 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.191259 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5ttt\" (UniqueName: \"kubernetes.io/projected/0f6d527d-88fb-4cce-becf-d744ce4cc27c-kube-api-access-p5ttt\") pod \"ceilometer-0\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.326162 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.585639 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-d56774dc9-sps89" podUID="5ae1380d-b481-4842-a4e5-6e96ad87b998" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.777678 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.874939 4865 generic.go:334] "Generic (PLEG): container finished" podID="519c901d-31cc-4600-96bb-66ecd95aba90" containerID="066d65d1a75ea3c41d2ddb1ec2ab58bf494f925726dba1b8b5732272a6307b53" exitCode=0 Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.875026 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8qkdx" event={"ID":"519c901d-31cc-4600-96bb-66ecd95aba90","Type":"ContainerDied","Data":"066d65d1a75ea3c41d2ddb1ec2ab58bf494f925726dba1b8b5732272a6307b53"} Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.875055 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8qkdx" event={"ID":"519c901d-31cc-4600-96bb-66ecd95aba90","Type":"ContainerStarted","Data":"6b36f12f0a88c883bc1e52977d60cb6462a25b7da404d27b3676c2b16a4deac2"} Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.876726 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f091-account-create-update-c4wdz" event={"ID":"7733d2ef-e9ed-4977-b294-3941be6b9455","Type":"ContainerStarted","Data":"5be37aa6e0a55b62c7458bf4560c5ec09f2218715a4863038da8dfbf2f86bdf4"} Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.876769 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f091-account-create-update-c4wdz" 
event={"ID":"7733d2ef-e9ed-4977-b294-3941be6b9455","Type":"ContainerStarted","Data":"458a8084261d629a73a02079a1d50e4cb171ed6c7ffcaa16654ed3796474d327"} Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.878797 4865 generic.go:334] "Generic (PLEG): container finished" podID="ce59a384-76dd-444f-8e42-e4eb194e48e9" containerID="9618f2107757ba62a6db52c30717edd999f8918dfb8277916fb5e111d328acbe" exitCode=0 Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.878898 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-r4lqw" event={"ID":"ce59a384-76dd-444f-8e42-e4eb194e48e9","Type":"ContainerDied","Data":"9618f2107757ba62a6db52c30717edd999f8918dfb8277916fb5e111d328acbe"} Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.880544 4865 generic.go:334] "Generic (PLEG): container finished" podID="2cec8b30-efea-4cd5-be6c-889b5dc59008" containerID="d7e05362ff84d81aa3cde40cc9f1c8996030cf683f693687fb8224024bf65310" exitCode=0 Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.880611 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-eee8-account-create-update-2jnjx" event={"ID":"2cec8b30-efea-4cd5-be6c-889b5dc59008","Type":"ContainerDied","Data":"d7e05362ff84d81aa3cde40cc9f1c8996030cf683f693687fb8224024bf65310"} Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.882335 4865 generic.go:334] "Generic (PLEG): container finished" podID="74b7332b-f0c3-4e97-9149-0b09e4f74727" containerID="58f16f1f4e44058939b4a82217da1cf07e666d16acef06252e4f4dfa4c57709f" exitCode=0 Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.882394 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4d95-account-create-update-xsbvp" event={"ID":"74b7332b-f0c3-4e97-9149-0b09e4f74727","Type":"ContainerDied","Data":"58f16f1f4e44058939b4a82217da1cf07e666d16acef06252e4f4dfa4c57709f"} Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.885447 4865 generic.go:334] "Generic (PLEG): container finished" podID="a8240c19-58ea-4d42-ac99-121c7f01e2f2" containerID="32a3bbce1be81ebb05c72067d0b57678445c07251b83bb02f85a5a839fb2dfca" exitCode=0 Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.886348 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6lclk" event={"ID":"a8240c19-58ea-4d42-ac99-121c7f01e2f2","Type":"ContainerDied","Data":"32a3bbce1be81ebb05c72067d0b57678445c07251b83bb02f85a5a839fb2dfca"} Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.886376 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6lclk" event={"ID":"a8240c19-58ea-4d42-ac99-121c7f01e2f2","Type":"ContainerStarted","Data":"03f23519f85977383be83732e818c496f39fa72170555beff3d342347fec9f4d"} Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.910848 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-d56774dc9-sps89" podUID="5ae1380d-b481-4842-a4e5-6e96ad87b998" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.956703 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-f091-account-create-update-c4wdz" podStartSLOduration=9.956683348 podStartE2EDuration="9.956683348s" podCreationTimestamp="2025-12-05 06:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:13:30.954029092 +0000 UTC 
m=+1230.234040304" watchObservedRunningTime="2025-12-05 06:13:30.956683348 +0000 UTC m=+1230.236694570" Dec 05 06:13:30 crc kubenswrapper[4865]: I1205 06:13:30.971550 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:13:30 crc kubenswrapper[4865]: W1205 06:13:30.989383 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f6d527d_88fb_4cce_becf_d744ce4cc27c.slice/crio-9c00786eec954fecc17f7117a1f894fbd8f7878aca53880f692297eadd3d31e3 WatchSource:0}: Error finding container 9c00786eec954fecc17f7117a1f894fbd8f7878aca53880f692297eadd3d31e3: Status 404 returned error can't find the container with id 9c00786eec954fecc17f7117a1f894fbd8f7878aca53880f692297eadd3d31e3 Dec 05 06:13:31 crc kubenswrapper[4865]: I1205 06:13:31.035693 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e1e4be9-e191-4245-b107-81e7ea608c7c" path="/var/lib/kubelet/pods/4e1e4be9-e191-4245-b107-81e7ea608c7c/volumes" Dec 05 06:13:31 crc kubenswrapper[4865]: I1205 06:13:31.909096 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f6d527d-88fb-4cce-becf-d744ce4cc27c","Type":"ContainerStarted","Data":"31d22f00e3fe047335adcaa9cd82b714669bf4e7c0b6032b0262034129d3ae77"} Dec 05 06:13:31 crc kubenswrapper[4865]: I1205 06:13:31.909361 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f6d527d-88fb-4cce-becf-d744ce4cc27c","Type":"ContainerStarted","Data":"9c00786eec954fecc17f7117a1f894fbd8f7878aca53880f692297eadd3d31e3"} Dec 05 06:13:31 crc kubenswrapper[4865]: I1205 06:13:31.924547 4865 generic.go:334] "Generic (PLEG): container finished" podID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerID="ae9988b24b0cc529f27a61e58c049c77ec8edcedb21946f5111a11587be650d1" exitCode=137 Dec 05 06:13:31 crc kubenswrapper[4865]: I1205 06:13:31.924620 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bd68dd9b8-z62zt" event={"ID":"ca38ca20-0d35-4058-b0f6-bbe4251c6aab","Type":"ContainerDied","Data":"ae9988b24b0cc529f27a61e58c049c77ec8edcedb21946f5111a11587be650d1"} Dec 05 06:13:31 crc kubenswrapper[4865]: I1205 06:13:31.924648 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bd68dd9b8-z62zt" event={"ID":"ca38ca20-0d35-4058-b0f6-bbe4251c6aab","Type":"ContainerStarted","Data":"ec7463e2f2848e1408401056294c94afe477842efd894f3e32e53fc184fec217"} Dec 05 06:13:31 crc kubenswrapper[4865]: I1205 06:13:31.935328 4865 generic.go:334] "Generic (PLEG): container finished" podID="7733d2ef-e9ed-4977-b294-3941be6b9455" containerID="5be37aa6e0a55b62c7458bf4560c5ec09f2218715a4863038da8dfbf2f86bdf4" exitCode=0 Dec 05 06:13:31 crc kubenswrapper[4865]: I1205 06:13:31.935401 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f091-account-create-update-c4wdz" event={"ID":"7733d2ef-e9ed-4977-b294-3941be6b9455","Type":"ContainerDied","Data":"5be37aa6e0a55b62c7458bf4560c5ec09f2218715a4863038da8dfbf2f86bdf4"} Dec 05 06:13:31 crc kubenswrapper[4865]: I1205 06:13:31.960378 4865 generic.go:334] "Generic (PLEG): container finished" podID="0b2dbfc6-6978-4613-a307-d4d4b4b88bc9" containerID="2c566b0fa9fad49639e8ef5098b129e44f8c6799cb1513e54a1766170e2190fd" exitCode=137 Dec 05 06:13:31 crc kubenswrapper[4865]: I1205 06:13:31.960581 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78c59b79fd-5jlv4" 
event={"ID":"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9","Type":"ContainerDied","Data":"2c566b0fa9fad49639e8ef5098b129e44f8c6799cb1513e54a1766170e2190fd"} Dec 05 06:13:31 crc kubenswrapper[4865]: I1205 06:13:31.960608 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78c59b79fd-5jlv4" event={"ID":"0b2dbfc6-6978-4613-a307-d4d4b4b88bc9","Type":"ContainerStarted","Data":"bef14102a1f8185582b72db84c6f23eff098338f9d3d9e5bf20ed04a490b97a4"} Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.215409 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.563369 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8qkdx" Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.612155 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/519c901d-31cc-4600-96bb-66ecd95aba90-operator-scripts\") pod \"519c901d-31cc-4600-96bb-66ecd95aba90\" (UID: \"519c901d-31cc-4600-96bb-66ecd95aba90\") " Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.612641 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrp6k\" (UniqueName: \"kubernetes.io/projected/519c901d-31cc-4600-96bb-66ecd95aba90-kube-api-access-xrp6k\") pod \"519c901d-31cc-4600-96bb-66ecd95aba90\" (UID: \"519c901d-31cc-4600-96bb-66ecd95aba90\") " Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.613163 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/519c901d-31cc-4600-96bb-66ecd95aba90-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "519c901d-31cc-4600-96bb-66ecd95aba90" (UID: "519c901d-31cc-4600-96bb-66ecd95aba90"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.656568 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/519c901d-31cc-4600-96bb-66ecd95aba90-kube-api-access-xrp6k" (OuterVolumeSpecName: "kube-api-access-xrp6k") pod "519c901d-31cc-4600-96bb-66ecd95aba90" (UID: "519c901d-31cc-4600-96bb-66ecd95aba90"). InnerVolumeSpecName "kube-api-access-xrp6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.716018 4865 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/519c901d-31cc-4600-96bb-66ecd95aba90-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.716051 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrp6k\" (UniqueName: \"kubernetes.io/projected/519c901d-31cc-4600-96bb-66ecd95aba90-kube-api-access-xrp6k\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.895717 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4d95-account-create-update-xsbvp" Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.901132 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-r4lqw" Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.906113 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-6lclk" Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.911315 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-eee8-account-create-update-2jnjx" Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.975261 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4d95-account-create-update-xsbvp" event={"ID":"74b7332b-f0c3-4e97-9149-0b09e4f74727","Type":"ContainerDied","Data":"96c95db86e4b6ec4ace4e31bef317942a8ef48cfa15c89bd1e039c242da5f224"} Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.975299 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96c95db86e4b6ec4ace4e31bef317942a8ef48cfa15c89bd1e039c242da5f224" Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.975357 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4d95-account-create-update-xsbvp" Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.986348 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f6d527d-88fb-4cce-becf-d744ce4cc27c","Type":"ContainerStarted","Data":"4e00d2bc486f6c1f8c805b86d156615e36dd1a6d0080a87a1edf456aeb229bee"} Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.988927 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6lclk" event={"ID":"a8240c19-58ea-4d42-ac99-121c7f01e2f2","Type":"ContainerDied","Data":"03f23519f85977383be83732e818c496f39fa72170555beff3d342347fec9f4d"} Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.988952 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03f23519f85977383be83732e818c496f39fa72170555beff3d342347fec9f4d" Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.989009 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6lclk" Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.995562 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-r4lqw" event={"ID":"ce59a384-76dd-444f-8e42-e4eb194e48e9","Type":"ContainerDied","Data":"378d4df83daf1f535eeb0b728a0e01fb36a7c98f7f8e3fe7fd464bd74687a9a2"} Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.995588 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="378d4df83daf1f535eeb0b728a0e01fb36a7c98f7f8e3fe7fd464bd74687a9a2" Dec 05 06:13:32 crc kubenswrapper[4865]: I1205 06:13:32.995630 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-r4lqw" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.000335 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-8qkdx" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.000428 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8qkdx" event={"ID":"519c901d-31cc-4600-96bb-66ecd95aba90","Type":"ContainerDied","Data":"6b36f12f0a88c883bc1e52977d60cb6462a25b7da404d27b3676c2b16a4deac2"} Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.000496 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b36f12f0a88c883bc1e52977d60cb6462a25b7da404d27b3676c2b16a4deac2" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.002352 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-eee8-account-create-update-2jnjx" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.002782 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-eee8-account-create-update-2jnjx" event={"ID":"2cec8b30-efea-4cd5-be6c-889b5dc59008","Type":"ContainerDied","Data":"dd7e535458ef9869af6073c8dbb9ba45bd94808b5bfc0967e6fc3a92ccace120"} Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.002802 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd7e535458ef9869af6073c8dbb9ba45bd94808b5bfc0967e6fc3a92ccace120" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.049285 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9zds\" (UniqueName: \"kubernetes.io/projected/ce59a384-76dd-444f-8e42-e4eb194e48e9-kube-api-access-v9zds\") pod \"ce59a384-76dd-444f-8e42-e4eb194e48e9\" (UID: \"ce59a384-76dd-444f-8e42-e4eb194e48e9\") " Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.049350 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngglw\" (UniqueName: \"kubernetes.io/projected/74b7332b-f0c3-4e97-9149-0b09e4f74727-kube-api-access-ngglw\") pod \"74b7332b-f0c3-4e97-9149-0b09e4f74727\" (UID: \"74b7332b-f0c3-4e97-9149-0b09e4f74727\") " Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.049440 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74b7332b-f0c3-4e97-9149-0b09e4f74727-operator-scripts\") pod \"74b7332b-f0c3-4e97-9149-0b09e4f74727\" (UID: \"74b7332b-f0c3-4e97-9149-0b09e4f74727\") " Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.049501 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8240c19-58ea-4d42-ac99-121c7f01e2f2-operator-scripts\") pod \"a8240c19-58ea-4d42-ac99-121c7f01e2f2\" (UID: \"a8240c19-58ea-4d42-ac99-121c7f01e2f2\") " Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.049535 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdpjp\" (UniqueName: \"kubernetes.io/projected/2cec8b30-efea-4cd5-be6c-889b5dc59008-kube-api-access-rdpjp\") pod \"2cec8b30-efea-4cd5-be6c-889b5dc59008\" (UID: \"2cec8b30-efea-4cd5-be6c-889b5dc59008\") " Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.049600 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce59a384-76dd-444f-8e42-e4eb194e48e9-operator-scripts\") pod \"ce59a384-76dd-444f-8e42-e4eb194e48e9\" (UID: \"ce59a384-76dd-444f-8e42-e4eb194e48e9\") " Dec 05 06:13:33 crc 
kubenswrapper[4865]: I1205 06:13:33.049616 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cec8b30-efea-4cd5-be6c-889b5dc59008-operator-scripts\") pod \"2cec8b30-efea-4cd5-be6c-889b5dc59008\" (UID: \"2cec8b30-efea-4cd5-be6c-889b5dc59008\") " Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.049653 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mll55\" (UniqueName: \"kubernetes.io/projected/a8240c19-58ea-4d42-ac99-121c7f01e2f2-kube-api-access-mll55\") pod \"a8240c19-58ea-4d42-ac99-121c7f01e2f2\" (UID: \"a8240c19-58ea-4d42-ac99-121c7f01e2f2\") " Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.051211 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74b7332b-f0c3-4e97-9149-0b09e4f74727-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "74b7332b-f0c3-4e97-9149-0b09e4f74727" (UID: "74b7332b-f0c3-4e97-9149-0b09e4f74727"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.053136 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8240c19-58ea-4d42-ac99-121c7f01e2f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a8240c19-58ea-4d42-ac99-121c7f01e2f2" (UID: "a8240c19-58ea-4d42-ac99-121c7f01e2f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.054185 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cec8b30-efea-4cd5-be6c-889b5dc59008-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2cec8b30-efea-4cd5-be6c-889b5dc59008" (UID: "2cec8b30-efea-4cd5-be6c-889b5dc59008"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.055646 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce59a384-76dd-444f-8e42-e4eb194e48e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ce59a384-76dd-444f-8e42-e4eb194e48e9" (UID: "ce59a384-76dd-444f-8e42-e4eb194e48e9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.055868 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce59a384-76dd-444f-8e42-e4eb194e48e9-kube-api-access-v9zds" (OuterVolumeSpecName: "kube-api-access-v9zds") pod "ce59a384-76dd-444f-8e42-e4eb194e48e9" (UID: "ce59a384-76dd-444f-8e42-e4eb194e48e9"). InnerVolumeSpecName "kube-api-access-v9zds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.062003 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8240c19-58ea-4d42-ac99-121c7f01e2f2-kube-api-access-mll55" (OuterVolumeSpecName: "kube-api-access-mll55") pod "a8240c19-58ea-4d42-ac99-121c7f01e2f2" (UID: "a8240c19-58ea-4d42-ac99-121c7f01e2f2"). InnerVolumeSpecName "kube-api-access-mll55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.063041 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74b7332b-f0c3-4e97-9149-0b09e4f74727-kube-api-access-ngglw" (OuterVolumeSpecName: "kube-api-access-ngglw") pod "74b7332b-f0c3-4e97-9149-0b09e4f74727" (UID: "74b7332b-f0c3-4e97-9149-0b09e4f74727"). InnerVolumeSpecName "kube-api-access-ngglw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.064637 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cec8b30-efea-4cd5-be6c-889b5dc59008-kube-api-access-rdpjp" (OuterVolumeSpecName: "kube-api-access-rdpjp") pod "2cec8b30-efea-4cd5-be6c-889b5dc59008" (UID: "2cec8b30-efea-4cd5-be6c-889b5dc59008"). InnerVolumeSpecName "kube-api-access-rdpjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.151707 4865 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74b7332b-f0c3-4e97-9149-0b09e4f74727-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.151733 4865 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8240c19-58ea-4d42-ac99-121c7f01e2f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.151743 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdpjp\" (UniqueName: \"kubernetes.io/projected/2cec8b30-efea-4cd5-be6c-889b5dc59008-kube-api-access-rdpjp\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.151753 4865 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce59a384-76dd-444f-8e42-e4eb194e48e9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.151761 4865 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cec8b30-efea-4cd5-be6c-889b5dc59008-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.151770 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mll55\" (UniqueName: \"kubernetes.io/projected/a8240c19-58ea-4d42-ac99-121c7f01e2f2-kube-api-access-mll55\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.151779 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9zds\" (UniqueName: \"kubernetes.io/projected/ce59a384-76dd-444f-8e42-e4eb194e48e9-kube-api-access-v9zds\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.151787 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngglw\" (UniqueName: \"kubernetes.io/projected/74b7332b-f0c3-4e97-9149-0b09e4f74727-kube-api-access-ngglw\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.341012 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f091-account-create-update-c4wdz" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.459699 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7733d2ef-e9ed-4977-b294-3941be6b9455-operator-scripts\") pod \"7733d2ef-e9ed-4977-b294-3941be6b9455\" (UID: \"7733d2ef-e9ed-4977-b294-3941be6b9455\") " Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.459933 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsxwv\" (UniqueName: \"kubernetes.io/projected/7733d2ef-e9ed-4977-b294-3941be6b9455-kube-api-access-nsxwv\") pod \"7733d2ef-e9ed-4977-b294-3941be6b9455\" (UID: \"7733d2ef-e9ed-4977-b294-3941be6b9455\") " Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.461453 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7733d2ef-e9ed-4977-b294-3941be6b9455-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7733d2ef-e9ed-4977-b294-3941be6b9455" (UID: "7733d2ef-e9ed-4977-b294-3941be6b9455"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.471970 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7733d2ef-e9ed-4977-b294-3941be6b9455-kube-api-access-nsxwv" (OuterVolumeSpecName: "kube-api-access-nsxwv") pod "7733d2ef-e9ed-4977-b294-3941be6b9455" (UID: "7733d2ef-e9ed-4977-b294-3941be6b9455"). InnerVolumeSpecName "kube-api-access-nsxwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:13:33 crc kubenswrapper[4865]: E1205 06:13:33.492058 4865 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8240c19_58ea_4d42_ac99_121c7f01e2f2.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74b7332b_f0c3_4e97_9149_0b09e4f74727.slice/crio-96c95db86e4b6ec4ace4e31bef317942a8ef48cfa15c89bd1e039c242da5f224\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cec8b30_efea_4cd5_be6c_889b5dc59008.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce59a384_76dd_444f_8e42_e4eb194e48e9.slice/crio-378d4df83daf1f535eeb0b728a0e01fb36a7c98f7f8e3fe7fd464bd74687a9a2\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74b7332b_f0c3_4e97_9149_0b09e4f74727.slice\": RecentStats: unable to find data in memory cache]" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.570815 4865 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7733d2ef-e9ed-4977-b294-3941be6b9455-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:33 crc kubenswrapper[4865]: I1205 06:13:33.570876 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsxwv\" (UniqueName: \"kubernetes.io/projected/7733d2ef-e9ed-4977-b294-3941be6b9455-kube-api-access-nsxwv\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:34 crc kubenswrapper[4865]: I1205 06:13:34.011441 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-f091-account-create-update-c4wdz" event={"ID":"7733d2ef-e9ed-4977-b294-3941be6b9455","Type":"ContainerDied","Data":"458a8084261d629a73a02079a1d50e4cb171ed6c7ffcaa16654ed3796474d327"} Dec 05 06:13:34 crc kubenswrapper[4865]: I1205 06:13:34.011738 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="458a8084261d629a73a02079a1d50e4cb171ed6c7ffcaa16654ed3796474d327" Dec 05 06:13:34 crc kubenswrapper[4865]: I1205 06:13:34.011851 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f091-account-create-update-c4wdz" Dec 05 06:13:34 crc kubenswrapper[4865]: I1205 06:13:34.027879 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f6d527d-88fb-4cce-becf-d744ce4cc27c","Type":"ContainerStarted","Data":"8ff68a6ebf94e5147f1b78bf1fdbebdd19b4cee58ab92ce6cd4d2ec1e7eb3cf2"} Dec 05 06:13:35 crc kubenswrapper[4865]: I1205 06:13:35.040052 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f6d527d-88fb-4cce-becf-d744ce4cc27c","Type":"ContainerStarted","Data":"7db9e32d69980553434e522a91220a211c60f84daf6d08d184deab18f963438f"} Dec 05 06:13:35 crc kubenswrapper[4865]: I1205 06:13:35.040443 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 06:13:35 crc kubenswrapper[4865]: I1205 06:13:35.040444 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" containerName="ceilometer-central-agent" containerID="cri-o://31d22f00e3fe047335adcaa9cd82b714669bf4e7c0b6032b0262034129d3ae77" gracePeriod=30 Dec 05 06:13:35 crc kubenswrapper[4865]: I1205 06:13:35.040506 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" containerName="sg-core" containerID="cri-o://8ff68a6ebf94e5147f1b78bf1fdbebdd19b4cee58ab92ce6cd4d2ec1e7eb3cf2" gracePeriod=30 Dec 05 06:13:35 crc kubenswrapper[4865]: I1205 06:13:35.040565 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" containerName="ceilometer-notification-agent" containerID="cri-o://4e00d2bc486f6c1f8c805b86d156615e36dd1a6d0080a87a1edf456aeb229bee" gracePeriod=30 Dec 05 06:13:35 crc kubenswrapper[4865]: I1205 06:13:35.040573 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" containerName="proxy-httpd" containerID="cri-o://7db9e32d69980553434e522a91220a211c60f84daf6d08d184deab18f963438f" gracePeriod=30 Dec 05 06:13:35 crc kubenswrapper[4865]: I1205 06:13:35.079784 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.7040315010000002 podStartE2EDuration="6.07974568s" podCreationTimestamp="2025-12-05 06:13:29 +0000 UTC" firstStartedPulling="2025-12-05 06:13:30.991708055 +0000 UTC m=+1230.271719267" lastFinishedPulling="2025-12-05 06:13:34.367422224 +0000 UTC m=+1233.647433446" observedRunningTime="2025-12-05 06:13:35.071478425 +0000 UTC m=+1234.351489647" watchObservedRunningTime="2025-12-05 06:13:35.07974568 +0000 UTC m=+1234.359756902" Dec 05 06:13:36 crc kubenswrapper[4865]: I1205 06:13:36.054594 4865 generic.go:334] "Generic (PLEG): container finished" 
podID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" containerID="7db9e32d69980553434e522a91220a211c60f84daf6d08d184deab18f963438f" exitCode=0 Dec 05 06:13:36 crc kubenswrapper[4865]: I1205 06:13:36.054918 4865 generic.go:334] "Generic (PLEG): container finished" podID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" containerID="8ff68a6ebf94e5147f1b78bf1fdbebdd19b4cee58ab92ce6cd4d2ec1e7eb3cf2" exitCode=2 Dec 05 06:13:36 crc kubenswrapper[4865]: I1205 06:13:36.054926 4865 generic.go:334] "Generic (PLEG): container finished" podID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" containerID="4e00d2bc486f6c1f8c805b86d156615e36dd1a6d0080a87a1edf456aeb229bee" exitCode=0 Dec 05 06:13:36 crc kubenswrapper[4865]: I1205 06:13:36.054948 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f6d527d-88fb-4cce-becf-d744ce4cc27c","Type":"ContainerDied","Data":"7db9e32d69980553434e522a91220a211c60f84daf6d08d184deab18f963438f"} Dec 05 06:13:36 crc kubenswrapper[4865]: I1205 06:13:36.054974 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f6d527d-88fb-4cce-becf-d744ce4cc27c","Type":"ContainerDied","Data":"8ff68a6ebf94e5147f1b78bf1fdbebdd19b4cee58ab92ce6cd4d2ec1e7eb3cf2"} Dec 05 06:13:36 crc kubenswrapper[4865]: I1205 06:13:36.054983 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f6d527d-88fb-4cce-becf-d744ce4cc27c","Type":"ContainerDied","Data":"4e00d2bc486f6c1f8c805b86d156615e36dd1a6d0080a87a1edf456aeb229bee"} Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.169737 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-g2g6p"] Dec 05 06:13:37 crc kubenswrapper[4865]: E1205 06:13:37.170221 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7733d2ef-e9ed-4977-b294-3941be6b9455" containerName="mariadb-account-create-update" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.170242 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="7733d2ef-e9ed-4977-b294-3941be6b9455" containerName="mariadb-account-create-update" Dec 05 06:13:37 crc kubenswrapper[4865]: E1205 06:13:37.170261 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce59a384-76dd-444f-8e42-e4eb194e48e9" containerName="mariadb-database-create" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.170269 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce59a384-76dd-444f-8e42-e4eb194e48e9" containerName="mariadb-database-create" Dec 05 06:13:37 crc kubenswrapper[4865]: E1205 06:13:37.170278 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8240c19-58ea-4d42-ac99-121c7f01e2f2" containerName="mariadb-database-create" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.170286 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8240c19-58ea-4d42-ac99-121c7f01e2f2" containerName="mariadb-database-create" Dec 05 06:13:37 crc kubenswrapper[4865]: E1205 06:13:37.170305 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74b7332b-f0c3-4e97-9149-0b09e4f74727" containerName="mariadb-account-create-update" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.170313 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="74b7332b-f0c3-4e97-9149-0b09e4f74727" containerName="mariadb-account-create-update" Dec 05 06:13:37 crc kubenswrapper[4865]: E1205 06:13:37.170326 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cec8b30-efea-4cd5-be6c-889b5dc59008" 
containerName="mariadb-account-create-update" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.170334 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cec8b30-efea-4cd5-be6c-889b5dc59008" containerName="mariadb-account-create-update" Dec 05 06:13:37 crc kubenswrapper[4865]: E1205 06:13:37.170353 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="519c901d-31cc-4600-96bb-66ecd95aba90" containerName="mariadb-database-create" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.170360 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="519c901d-31cc-4600-96bb-66ecd95aba90" containerName="mariadb-database-create" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.170572 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="74b7332b-f0c3-4e97-9149-0b09e4f74727" containerName="mariadb-account-create-update" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.170587 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cec8b30-efea-4cd5-be6c-889b5dc59008" containerName="mariadb-account-create-update" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.170596 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce59a384-76dd-444f-8e42-e4eb194e48e9" containerName="mariadb-database-create" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.170614 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8240c19-58ea-4d42-ac99-121c7f01e2f2" containerName="mariadb-database-create" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.170634 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="519c901d-31cc-4600-96bb-66ecd95aba90" containerName="mariadb-database-create" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.170645 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="7733d2ef-e9ed-4977-b294-3941be6b9455" containerName="mariadb-account-create-update" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.171321 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-g2g6p" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.176543 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.176916 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.177029 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vrbg2" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.194377 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-g2g6p"] Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.342082 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgdh4\" (UniqueName: \"kubernetes.io/projected/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-kube-api-access-jgdh4\") pod \"nova-cell0-conductor-db-sync-g2g6p\" (UID: \"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\") " pod="openstack/nova-cell0-conductor-db-sync-g2g6p" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.342377 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-scripts\") pod \"nova-cell0-conductor-db-sync-g2g6p\" (UID: \"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\") " pod="openstack/nova-cell0-conductor-db-sync-g2g6p" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.342446 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-config-data\") pod \"nova-cell0-conductor-db-sync-g2g6p\" (UID: \"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\") " pod="openstack/nova-cell0-conductor-db-sync-g2g6p" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.342574 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-g2g6p\" (UID: \"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\") " pod="openstack/nova-cell0-conductor-db-sync-g2g6p" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.444592 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgdh4\" (UniqueName: \"kubernetes.io/projected/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-kube-api-access-jgdh4\") pod \"nova-cell0-conductor-db-sync-g2g6p\" (UID: \"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\") " pod="openstack/nova-cell0-conductor-db-sync-g2g6p" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.444709 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-scripts\") pod \"nova-cell0-conductor-db-sync-g2g6p\" (UID: \"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\") " pod="openstack/nova-cell0-conductor-db-sync-g2g6p" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.444802 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-config-data\") pod \"nova-cell0-conductor-db-sync-g2g6p\" (UID: 
\"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\") " pod="openstack/nova-cell0-conductor-db-sync-g2g6p" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.444890 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-g2g6p\" (UID: \"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\") " pod="openstack/nova-cell0-conductor-db-sync-g2g6p" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.450619 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-g2g6p\" (UID: \"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\") " pod="openstack/nova-cell0-conductor-db-sync-g2g6p" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.463064 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-scripts\") pod \"nova-cell0-conductor-db-sync-g2g6p\" (UID: \"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\") " pod="openstack/nova-cell0-conductor-db-sync-g2g6p" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.467092 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-config-data\") pod \"nova-cell0-conductor-db-sync-g2g6p\" (UID: \"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\") " pod="openstack/nova-cell0-conductor-db-sync-g2g6p" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.471702 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgdh4\" (UniqueName: \"kubernetes.io/projected/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-kube-api-access-jgdh4\") pod \"nova-cell0-conductor-db-sync-g2g6p\" (UID: \"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\") " pod="openstack/nova-cell0-conductor-db-sync-g2g6p" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.492395 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-g2g6p" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.818018 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.958568 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-scripts\") pod \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.958642 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-combined-ca-bundle\") pod \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.958733 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f6d527d-88fb-4cce-becf-d744ce4cc27c-run-httpd\") pod \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.958802 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-config-data\") pod \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.958858 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5ttt\" (UniqueName: \"kubernetes.io/projected/0f6d527d-88fb-4cce-becf-d744ce4cc27c-kube-api-access-p5ttt\") pod \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.959144 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f6d527d-88fb-4cce-becf-d744ce4cc27c-log-httpd\") pod \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.959208 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-sg-core-conf-yaml\") pod \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\" (UID: \"0f6d527d-88fb-4cce-becf-d744ce4cc27c\") " Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.960353 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f6d527d-88fb-4cce-becf-d744ce4cc27c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0f6d527d-88fb-4cce-becf-d744ce4cc27c" (UID: "0f6d527d-88fb-4cce-becf-d744ce4cc27c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.960727 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f6d527d-88fb-4cce-becf-d744ce4cc27c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0f6d527d-88fb-4cce-becf-d744ce4cc27c" (UID: "0f6d527d-88fb-4cce-becf-d744ce4cc27c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.988001 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f6d527d-88fb-4cce-becf-d744ce4cc27c-kube-api-access-p5ttt" (OuterVolumeSpecName: "kube-api-access-p5ttt") pod "0f6d527d-88fb-4cce-becf-d744ce4cc27c" (UID: "0f6d527d-88fb-4cce-becf-d744ce4cc27c"). InnerVolumeSpecName "kube-api-access-p5ttt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:13:37 crc kubenswrapper[4865]: I1205 06:13:37.989891 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-scripts" (OuterVolumeSpecName: "scripts") pod "0f6d527d-88fb-4cce-becf-d744ce4cc27c" (UID: "0f6d527d-88fb-4cce-becf-d744ce4cc27c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.019937 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0f6d527d-88fb-4cce-becf-d744ce4cc27c" (UID: "0f6d527d-88fb-4cce-becf-d744ce4cc27c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.064166 4865 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f6d527d-88fb-4cce-becf-d744ce4cc27c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.064216 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5ttt\" (UniqueName: \"kubernetes.io/projected/0f6d527d-88fb-4cce-becf-d744ce4cc27c-kube-api-access-p5ttt\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.064229 4865 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f6d527d-88fb-4cce-becf-d744ce4cc27c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.064237 4865 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.064245 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.075580 4865 generic.go:334] "Generic (PLEG): container finished" podID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" containerID="31d22f00e3fe047335adcaa9cd82b714669bf4e7c0b6032b0262034129d3ae77" exitCode=0 Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.075642 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f6d527d-88fb-4cce-becf-d744ce4cc27c","Type":"ContainerDied","Data":"31d22f00e3fe047335adcaa9cd82b714669bf4e7c0b6032b0262034129d3ae77"} Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.075674 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f6d527d-88fb-4cce-becf-d744ce4cc27c","Type":"ContainerDied","Data":"9c00786eec954fecc17f7117a1f894fbd8f7878aca53880f692297eadd3d31e3"} Dec 05 
06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.075691 4865 scope.go:117] "RemoveContainer" containerID="7db9e32d69980553434e522a91220a211c60f84daf6d08d184deab18f963438f" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.075899 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.096158 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f6d527d-88fb-4cce-becf-d744ce4cc27c" (UID: "0f6d527d-88fb-4cce-becf-d744ce4cc27c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.101660 4865 scope.go:117] "RemoveContainer" containerID="8ff68a6ebf94e5147f1b78bf1fdbebdd19b4cee58ab92ce6cd4d2ec1e7eb3cf2" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.110368 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-config-data" (OuterVolumeSpecName: "config-data") pod "0f6d527d-88fb-4cce-becf-d744ce4cc27c" (UID: "0f6d527d-88fb-4cce-becf-d744ce4cc27c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.119131 4865 scope.go:117] "RemoveContainer" containerID="4e00d2bc486f6c1f8c805b86d156615e36dd1a6d0080a87a1edf456aeb229bee" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.139106 4865 scope.go:117] "RemoveContainer" containerID="31d22f00e3fe047335adcaa9cd82b714669bf4e7c0b6032b0262034129d3ae77" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.165831 4865 scope.go:117] "RemoveContainer" containerID="7db9e32d69980553434e522a91220a211c60f84daf6d08d184deab18f963438f" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.166030 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.166051 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6d527d-88fb-4cce-becf-d744ce4cc27c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:38 crc kubenswrapper[4865]: E1205 06:13:38.167796 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7db9e32d69980553434e522a91220a211c60f84daf6d08d184deab18f963438f\": container with ID starting with 7db9e32d69980553434e522a91220a211c60f84daf6d08d184deab18f963438f not found: ID does not exist" containerID="7db9e32d69980553434e522a91220a211c60f84daf6d08d184deab18f963438f" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.167851 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7db9e32d69980553434e522a91220a211c60f84daf6d08d184deab18f963438f"} err="failed to get container status \"7db9e32d69980553434e522a91220a211c60f84daf6d08d184deab18f963438f\": rpc error: code = NotFound desc = could not find container \"7db9e32d69980553434e522a91220a211c60f84daf6d08d184deab18f963438f\": container with ID starting with 7db9e32d69980553434e522a91220a211c60f84daf6d08d184deab18f963438f not found: ID does not exist" Dec 05 06:13:38 crc 
kubenswrapper[4865]: I1205 06:13:38.167877 4865 scope.go:117] "RemoveContainer" containerID="8ff68a6ebf94e5147f1b78bf1fdbebdd19b4cee58ab92ce6cd4d2ec1e7eb3cf2" Dec 05 06:13:38 crc kubenswrapper[4865]: E1205 06:13:38.168335 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ff68a6ebf94e5147f1b78bf1fdbebdd19b4cee58ab92ce6cd4d2ec1e7eb3cf2\": container with ID starting with 8ff68a6ebf94e5147f1b78bf1fdbebdd19b4cee58ab92ce6cd4d2ec1e7eb3cf2 not found: ID does not exist" containerID="8ff68a6ebf94e5147f1b78bf1fdbebdd19b4cee58ab92ce6cd4d2ec1e7eb3cf2" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.168363 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ff68a6ebf94e5147f1b78bf1fdbebdd19b4cee58ab92ce6cd4d2ec1e7eb3cf2"} err="failed to get container status \"8ff68a6ebf94e5147f1b78bf1fdbebdd19b4cee58ab92ce6cd4d2ec1e7eb3cf2\": rpc error: code = NotFound desc = could not find container \"8ff68a6ebf94e5147f1b78bf1fdbebdd19b4cee58ab92ce6cd4d2ec1e7eb3cf2\": container with ID starting with 8ff68a6ebf94e5147f1b78bf1fdbebdd19b4cee58ab92ce6cd4d2ec1e7eb3cf2 not found: ID does not exist" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.168387 4865 scope.go:117] "RemoveContainer" containerID="4e00d2bc486f6c1f8c805b86d156615e36dd1a6d0080a87a1edf456aeb229bee" Dec 05 06:13:38 crc kubenswrapper[4865]: E1205 06:13:38.168777 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e00d2bc486f6c1f8c805b86d156615e36dd1a6d0080a87a1edf456aeb229bee\": container with ID starting with 4e00d2bc486f6c1f8c805b86d156615e36dd1a6d0080a87a1edf456aeb229bee not found: ID does not exist" containerID="4e00d2bc486f6c1f8c805b86d156615e36dd1a6d0080a87a1edf456aeb229bee" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.168801 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e00d2bc486f6c1f8c805b86d156615e36dd1a6d0080a87a1edf456aeb229bee"} err="failed to get container status \"4e00d2bc486f6c1f8c805b86d156615e36dd1a6d0080a87a1edf456aeb229bee\": rpc error: code = NotFound desc = could not find container \"4e00d2bc486f6c1f8c805b86d156615e36dd1a6d0080a87a1edf456aeb229bee\": container with ID starting with 4e00d2bc486f6c1f8c805b86d156615e36dd1a6d0080a87a1edf456aeb229bee not found: ID does not exist" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.168815 4865 scope.go:117] "RemoveContainer" containerID="31d22f00e3fe047335adcaa9cd82b714669bf4e7c0b6032b0262034129d3ae77" Dec 05 06:13:38 crc kubenswrapper[4865]: E1205 06:13:38.169132 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31d22f00e3fe047335adcaa9cd82b714669bf4e7c0b6032b0262034129d3ae77\": container with ID starting with 31d22f00e3fe047335adcaa9cd82b714669bf4e7c0b6032b0262034129d3ae77 not found: ID does not exist" containerID="31d22f00e3fe047335adcaa9cd82b714669bf4e7c0b6032b0262034129d3ae77" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.169153 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31d22f00e3fe047335adcaa9cd82b714669bf4e7c0b6032b0262034129d3ae77"} err="failed to get container status \"31d22f00e3fe047335adcaa9cd82b714669bf4e7c0b6032b0262034129d3ae77\": rpc error: code = NotFound desc = could not find container 
\"31d22f00e3fe047335adcaa9cd82b714669bf4e7c0b6032b0262034129d3ae77\": container with ID starting with 31d22f00e3fe047335adcaa9cd82b714669bf4e7c0b6032b0262034129d3ae77 not found: ID does not exist" Dec 05 06:13:38 crc kubenswrapper[4865]: W1205 06:13:38.172146 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cdda782_13a7_4c36_a8f3_7b1d09fd2ca1.slice/crio-89ed7800bf7ae2c8e5678e239949816f78c03341fe289db77ff4d76b30f12fed WatchSource:0}: Error finding container 89ed7800bf7ae2c8e5678e239949816f78c03341fe289db77ff4d76b30f12fed: Status 404 returned error can't find the container with id 89ed7800bf7ae2c8e5678e239949816f78c03341fe289db77ff4d76b30f12fed Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.181101 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-g2g6p"] Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.418030 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.430558 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.459866 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:13:38 crc kubenswrapper[4865]: E1205 06:13:38.460345 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" containerName="proxy-httpd" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.460371 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" containerName="proxy-httpd" Dec 05 06:13:38 crc kubenswrapper[4865]: E1205 06:13:38.460410 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" containerName="ceilometer-central-agent" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.460419 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" containerName="ceilometer-central-agent" Dec 05 06:13:38 crc kubenswrapper[4865]: E1205 06:13:38.460437 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" containerName="sg-core" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.460445 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" containerName="sg-core" Dec 05 06:13:38 crc kubenswrapper[4865]: E1205 06:13:38.460466 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" containerName="ceilometer-notification-agent" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.460474 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" containerName="ceilometer-notification-agent" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.460714 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" containerName="proxy-httpd" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.460744 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" containerName="sg-core" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.460768 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" 
containerName="ceilometer-central-agent" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.460809 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" containerName="ceilometer-notification-agent" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.462960 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.466532 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.466796 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.514770 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.575486 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d56774dc9-sps89" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.589579 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-log-httpd\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.589639 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-config-data\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.589671 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-scripts\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.589720 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htm89\" (UniqueName: \"kubernetes.io/projected/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-kube-api-access-htm89\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.589747 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-run-httpd\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.589777 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.589797 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.691621 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.691888 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-run-httpd\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.691913 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.692052 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-log-httpd\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.692079 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-config-data\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.692111 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-scripts\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.692149 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htm89\" (UniqueName: \"kubernetes.io/projected/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-kube-api-access-htm89\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.693069 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-log-httpd\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.693592 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-run-httpd\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.698500 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.701091 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.701628 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-config-data\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.710849 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-scripts\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.721671 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htm89\" (UniqueName: \"kubernetes.io/projected/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-kube-api-access-htm89\") pod \"ceilometer-0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " pod="openstack/ceilometer-0" Dec 05 06:13:38 crc kubenswrapper[4865]: I1205 06:13:38.790973 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:13:39 crc kubenswrapper[4865]: I1205 06:13:39.026393 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f6d527d-88fb-4cce-becf-d744ce4cc27c" path="/var/lib/kubelet/pods/0f6d527d-88fb-4cce-becf-d744ce4cc27c/volumes" Dec 05 06:13:39 crc kubenswrapper[4865]: I1205 06:13:39.157632 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-g2g6p" event={"ID":"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1","Type":"ContainerStarted","Data":"89ed7800bf7ae2c8e5678e239949816f78c03341fe289db77ff4d76b30f12fed"} Dec 05 06:13:39 crc kubenswrapper[4865]: I1205 06:13:39.287191 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:13:39 crc kubenswrapper[4865]: W1205 06:13:39.294722 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8980941_4bc7_4ea0_a7cf_e6e3a03d02f0.slice/crio-cf1359be45a9d6d7b062f5c4d3877be46e051cd13d5dc4ad86346d95102ad6c5 WatchSource:0}: Error finding container cf1359be45a9d6d7b062f5c4d3877be46e051cd13d5dc4ad86346d95102ad6c5: Status 404 returned error can't find the container with id cf1359be45a9d6d7b062f5c4d3877be46e051cd13d5dc4ad86346d95102ad6c5 Dec 05 06:13:39 crc kubenswrapper[4865]: I1205 06:13:39.691955 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:13:40 crc kubenswrapper[4865]: I1205 06:13:40.175678 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0","Type":"ContainerStarted","Data":"a854569bfe5f64034de0c2f46c6f057011a89690fd27a4e079b96c9689098f19"} Dec 05 06:13:40 crc kubenswrapper[4865]: I1205 06:13:40.175998 4865 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0","Type":"ContainerStarted","Data":"cf1359be45a9d6d7b062f5c4d3877be46e051cd13d5dc4ad86346d95102ad6c5"} Dec 05 06:13:41 crc kubenswrapper[4865]: I1205 06:13:41.157109 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:13:41 crc kubenswrapper[4865]: I1205 06:13:41.157396 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:13:41 crc kubenswrapper[4865]: I1205 06:13:41.159975 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-bd68dd9b8-z62zt" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 05 06:13:41 crc kubenswrapper[4865]: I1205 06:13:41.298654 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:13:41 crc kubenswrapper[4865]: I1205 06:13:41.298725 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:13:41 crc kubenswrapper[4865]: I1205 06:13:41.300476 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-78c59b79fd-5jlv4" podUID="0b2dbfc6-6978-4613-a307-d4d4b4b88bc9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 05 06:13:42 crc kubenswrapper[4865]: I1205 06:13:42.203291 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0","Type":"ContainerStarted","Data":"4626a8d95506322cf22624e542215549495418a13ace2b6e864aa9dd319b2b1a"} Dec 05 06:13:47 crc kubenswrapper[4865]: I1205 06:13:47.690773 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:13:47 crc kubenswrapper[4865]: I1205 06:13:47.692775 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2a7a9002-45f0-4787-a5f0-d1dafdb275d2" containerName="glance-log" containerID="cri-o://55cac1ca6d6fd7d44cca6d53a391953f66a7a4cbb953929e5d427c25c991fae1" gracePeriod=30 Dec 05 06:13:47 crc kubenswrapper[4865]: I1205 06:13:47.692876 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2a7a9002-45f0-4787-a5f0-d1dafdb275d2" containerName="glance-httpd" containerID="cri-o://899e2f12ffb610d91c186ad3ed88dffe94d5245e9912c5f90b2d8c33d6271291" gracePeriod=30 Dec 05 06:13:48 crc kubenswrapper[4865]: I1205 06:13:48.268289 4865 generic.go:334] "Generic (PLEG): container finished" podID="2a7a9002-45f0-4787-a5f0-d1dafdb275d2" containerID="55cac1ca6d6fd7d44cca6d53a391953f66a7a4cbb953929e5d427c25c991fae1" exitCode=143 Dec 05 06:13:48 crc kubenswrapper[4865]: I1205 06:13:48.268381 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a7a9002-45f0-4787-a5f0-d1dafdb275d2","Type":"ContainerDied","Data":"55cac1ca6d6fd7d44cca6d53a391953f66a7a4cbb953929e5d427c25c991fae1"} Dec 05 06:13:51 crc kubenswrapper[4865]: I1205 06:13:51.157539 4865 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/horizon-bd68dd9b8-z62zt" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 05 06:13:51 crc kubenswrapper[4865]: I1205 06:13:51.299346 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-78c59b79fd-5jlv4" podUID="0b2dbfc6-6978-4613-a307-d4d4b4b88bc9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 05 06:13:51 crc kubenswrapper[4865]: I1205 06:13:51.330980 4865 generic.go:334] "Generic (PLEG): container finished" podID="2a7a9002-45f0-4787-a5f0-d1dafdb275d2" containerID="899e2f12ffb610d91c186ad3ed88dffe94d5245e9912c5f90b2d8c33d6271291" exitCode=0 Dec 05 06:13:51 crc kubenswrapper[4865]: I1205 06:13:51.331076 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a7a9002-45f0-4787-a5f0-d1dafdb275d2","Type":"ContainerDied","Data":"899e2f12ffb610d91c186ad3ed88dffe94d5245e9912c5f90b2d8c33d6271291"} Dec 05 06:13:51 crc kubenswrapper[4865]: I1205 06:13:51.336080 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-g2g6p" event={"ID":"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1","Type":"ContainerStarted","Data":"6870896ac2ef3a4699b002484f042cdd44757cdd6a9115f2c877d993d650e74e"} Dec 05 06:13:51 crc kubenswrapper[4865]: I1205 06:13:51.339926 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0","Type":"ContainerStarted","Data":"f6fe61521ad90a16156e49ed8286e05af7595598d5d2cc24e1be23f06621ac62"} Dec 05 06:13:51 crc kubenswrapper[4865]: I1205 06:13:51.363172 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-g2g6p" podStartSLOduration=2.206627741 podStartE2EDuration="14.363148184s" podCreationTimestamp="2025-12-05 06:13:37 +0000 UTC" firstStartedPulling="2025-12-05 06:13:38.174656216 +0000 UTC m=+1237.454667438" lastFinishedPulling="2025-12-05 06:13:50.331176659 +0000 UTC m=+1249.611187881" observedRunningTime="2025-12-05 06:13:51.351877594 +0000 UTC m=+1250.631888816" watchObservedRunningTime="2025-12-05 06:13:51.363148184 +0000 UTC m=+1250.643159406" Dec 05 06:13:51 crc kubenswrapper[4865]: I1205 06:13:51.825339 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.020282 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pkzq\" (UniqueName: \"kubernetes.io/projected/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-kube-api-access-2pkzq\") pod \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.020912 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-internal-tls-certs\") pod \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.021030 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-httpd-run\") pod \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.021156 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-scripts\") pod \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.021246 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-logs\") pod \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.021383 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2a7a9002-45f0-4787-a5f0-d1dafdb275d2" (UID: "2a7a9002-45f0-4787-a5f0-d1dafdb275d2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.021529 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-config-data\") pod \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.021663 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.021558 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-logs" (OuterVolumeSpecName: "logs") pod "2a7a9002-45f0-4787-a5f0-d1dafdb275d2" (UID: "2a7a9002-45f0-4787-a5f0-d1dafdb275d2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.021851 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-combined-ca-bundle\") pod \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\" (UID: \"2a7a9002-45f0-4787-a5f0-d1dafdb275d2\") " Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.022476 4865 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.022565 4865 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-logs\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.034529 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-scripts" (OuterVolumeSpecName: "scripts") pod "2a7a9002-45f0-4787-a5f0-d1dafdb275d2" (UID: "2a7a9002-45f0-4787-a5f0-d1dafdb275d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.035402 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-kube-api-access-2pkzq" (OuterVolumeSpecName: "kube-api-access-2pkzq") pod "2a7a9002-45f0-4787-a5f0-d1dafdb275d2" (UID: "2a7a9002-45f0-4787-a5f0-d1dafdb275d2"). InnerVolumeSpecName "kube-api-access-2pkzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.044541 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "2a7a9002-45f0-4787-a5f0-d1dafdb275d2" (UID: "2a7a9002-45f0-4787-a5f0-d1dafdb275d2"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.092230 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a7a9002-45f0-4787-a5f0-d1dafdb275d2" (UID: "2a7a9002-45f0-4787-a5f0-d1dafdb275d2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.125621 4865 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.126952 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.127077 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pkzq\" (UniqueName: \"kubernetes.io/projected/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-kube-api-access-2pkzq\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.127155 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.150230 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2a7a9002-45f0-4787-a5f0-d1dafdb275d2" (UID: "2a7a9002-45f0-4787-a5f0-d1dafdb275d2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.150918 4865 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.162932 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-config-data" (OuterVolumeSpecName: "config-data") pod "2a7a9002-45f0-4787-a5f0-d1dafdb275d2" (UID: "2a7a9002-45f0-4787-a5f0-d1dafdb275d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.230217 4865 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.230265 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7a9002-45f0-4787-a5f0-d1dafdb275d2-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.230277 4865 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.357372 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0","Type":"ContainerStarted","Data":"bf5eb3c540bf6f4d1c2cf3369e46010a3c5b8373292bec7ce126cccaca6796e4"} Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.357535 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerName="ceilometer-central-agent" containerID="cri-o://a854569bfe5f64034de0c2f46c6f057011a89690fd27a4e079b96c9689098f19" gracePeriod=30 Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.357606 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.357621 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerName="proxy-httpd" containerID="cri-o://bf5eb3c540bf6f4d1c2cf3369e46010a3c5b8373292bec7ce126cccaca6796e4" gracePeriod=30 Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.357658 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerName="sg-core" containerID="cri-o://f6fe61521ad90a16156e49ed8286e05af7595598d5d2cc24e1be23f06621ac62" gracePeriod=30 Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.357685 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerName="ceilometer-notification-agent" containerID="cri-o://4626a8d95506322cf22624e542215549495418a13ace2b6e864aa9dd319b2b1a" gracePeriod=30 Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.374137 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.374254 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a7a9002-45f0-4787-a5f0-d1dafdb275d2","Type":"ContainerDied","Data":"b7db5c358398a6e320934e5cff598b051c3d63483c68acd4cedcff359b3709a2"} Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.374308 4865 scope.go:117] "RemoveContainer" containerID="899e2f12ffb610d91c186ad3ed88dffe94d5245e9912c5f90b2d8c33d6271291" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.393352 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.068878816 podStartE2EDuration="14.393326899s" podCreationTimestamp="2025-12-05 06:13:38 +0000 UTC" firstStartedPulling="2025-12-05 06:13:39.296562451 +0000 UTC m=+1238.576573673" lastFinishedPulling="2025-12-05 06:13:51.621010544 +0000 UTC m=+1250.901021756" observedRunningTime="2025-12-05 06:13:52.385385412 +0000 UTC m=+1251.665396634" watchObservedRunningTime="2025-12-05 06:13:52.393326899 +0000 UTC m=+1251.673338121" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.419891 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.430128 4865 scope.go:117] "RemoveContainer" containerID="55cac1ca6d6fd7d44cca6d53a391953f66a7a4cbb953929e5d427c25c991fae1" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.431868 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.467534 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:13:52 crc kubenswrapper[4865]: E1205 06:13:52.467995 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7a9002-45f0-4787-a5f0-d1dafdb275d2" containerName="glance-httpd" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.468012 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7a9002-45f0-4787-a5f0-d1dafdb275d2" containerName="glance-httpd" Dec 05 06:13:52 crc kubenswrapper[4865]: E1205 06:13:52.468044 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7a9002-45f0-4787-a5f0-d1dafdb275d2" containerName="glance-log" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.468050 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7a9002-45f0-4787-a5f0-d1dafdb275d2" containerName="glance-log" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.468223 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a7a9002-45f0-4787-a5f0-d1dafdb275d2" containerName="glance-log" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.468245 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a7a9002-45f0-4787-a5f0-d1dafdb275d2" containerName="glance-httpd" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.469284 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.471860 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.472755 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.515257 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.636716 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e54c5fe9-12f5-40a2-a472-249d1510d49c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.637283 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54c5fe9-12f5-40a2-a472-249d1510d49c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.637386 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxs6h\" (UniqueName: \"kubernetes.io/projected/e54c5fe9-12f5-40a2-a472-249d1510d49c-kube-api-access-qxs6h\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.637457 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e54c5fe9-12f5-40a2-a472-249d1510d49c-logs\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.637529 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e54c5fe9-12f5-40a2-a472-249d1510d49c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.637622 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.637737 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e54c5fe9-12f5-40a2-a472-249d1510d49c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.637838 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54c5fe9-12f5-40a2-a472-249d1510d49c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.739087 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e54c5fe9-12f5-40a2-a472-249d1510d49c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.739928 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54c5fe9-12f5-40a2-a472-249d1510d49c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.739985 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e54c5fe9-12f5-40a2-a472-249d1510d49c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.740132 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54c5fe9-12f5-40a2-a472-249d1510d49c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.740296 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxs6h\" (UniqueName: \"kubernetes.io/projected/e54c5fe9-12f5-40a2-a472-249d1510d49c-kube-api-access-qxs6h\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.740537 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e54c5fe9-12f5-40a2-a472-249d1510d49c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.740546 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e54c5fe9-12f5-40a2-a472-249d1510d49c-logs\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.740609 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e54c5fe9-12f5-40a2-a472-249d1510d49c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.740693 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.740810 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e54c5fe9-12f5-40a2-a472-249d1510d49c-logs\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.741182 4865 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.754735 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54c5fe9-12f5-40a2-a472-249d1510d49c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.755236 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e54c5fe9-12f5-40a2-a472-249d1510d49c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.757435 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54c5fe9-12f5-40a2-a472-249d1510d49c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.768235 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e54c5fe9-12f5-40a2-a472-249d1510d49c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.773414 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxs6h\" (UniqueName: \"kubernetes.io/projected/e54c5fe9-12f5-40a2-a472-249d1510d49c-kube-api-access-qxs6h\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.789208 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e54c5fe9-12f5-40a2-a472-249d1510d49c\") " pod="openstack/glance-default-internal-api-0" Dec 05 06:13:52 crc kubenswrapper[4865]: I1205 06:13:52.858359 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 06:13:53 crc kubenswrapper[4865]: I1205 06:13:53.017941 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a7a9002-45f0-4787-a5f0-d1dafdb275d2" path="/var/lib/kubelet/pods/2a7a9002-45f0-4787-a5f0-d1dafdb275d2/volumes" Dec 05 06:13:53 crc kubenswrapper[4865]: I1205 06:13:53.388893 4865 generic.go:334] "Generic (PLEG): container finished" podID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerID="f6fe61521ad90a16156e49ed8286e05af7595598d5d2cc24e1be23f06621ac62" exitCode=2 Dec 05 06:13:53 crc kubenswrapper[4865]: I1205 06:13:53.388990 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0","Type":"ContainerDied","Data":"f6fe61521ad90a16156e49ed8286e05af7595598d5d2cc24e1be23f06621ac62"} Dec 05 06:13:53 crc kubenswrapper[4865]: W1205 06:13:53.828413 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode54c5fe9_12f5_40a2_a472_249d1510d49c.slice/crio-734a02790c895cc00e23a1b87f5fec4f90432a063f85cfe21d40fda4fde33825 WatchSource:0}: Error finding container 734a02790c895cc00e23a1b87f5fec4f90432a063f85cfe21d40fda4fde33825: Status 404 returned error can't find the container with id 734a02790c895cc00e23a1b87f5fec4f90432a063f85cfe21d40fda4fde33825 Dec 05 06:13:53 crc kubenswrapper[4865]: I1205 06:13:53.841737 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 06:13:54 crc kubenswrapper[4865]: E1205 06:13:54.131787 4865 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8980941_4bc7_4ea0_a7cf_e6e3a03d02f0.slice/crio-conmon-4626a8d95506322cf22624e542215549495418a13ace2b6e864aa9dd319b2b1a.scope\": RecentStats: unable to find data in memory cache]" Dec 05 06:13:54 crc kubenswrapper[4865]: I1205 06:13:54.412263 4865 generic.go:334] "Generic (PLEG): container finished" podID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerID="4626a8d95506322cf22624e542215549495418a13ace2b6e864aa9dd319b2b1a" exitCode=0 Dec 05 06:13:54 crc kubenswrapper[4865]: I1205 06:13:54.412294 4865 generic.go:334] "Generic (PLEG): container finished" podID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerID="a854569bfe5f64034de0c2f46c6f057011a89690fd27a4e079b96c9689098f19" exitCode=0 Dec 05 06:13:54 crc kubenswrapper[4865]: I1205 06:13:54.412342 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0","Type":"ContainerDied","Data":"4626a8d95506322cf22624e542215549495418a13ace2b6e864aa9dd319b2b1a"} Dec 05 06:13:54 crc kubenswrapper[4865]: I1205 06:13:54.412429 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0","Type":"ContainerDied","Data":"a854569bfe5f64034de0c2f46c6f057011a89690fd27a4e079b96c9689098f19"} Dec 05 06:13:54 crc kubenswrapper[4865]: I1205 06:13:54.413363 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e54c5fe9-12f5-40a2-a472-249d1510d49c","Type":"ContainerStarted","Data":"734a02790c895cc00e23a1b87f5fec4f90432a063f85cfe21d40fda4fde33825"} Dec 05 06:13:55 crc kubenswrapper[4865]: I1205 06:13:55.453489 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"e54c5fe9-12f5-40a2-a472-249d1510d49c","Type":"ContainerStarted","Data":"b51225b3bc531a610ee09f720d5a9f4a9a317fa65f97d9db6f39c4e6f5d912a5"} Dec 05 06:13:55 crc kubenswrapper[4865]: I1205 06:13:55.454027 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e54c5fe9-12f5-40a2-a472-249d1510d49c","Type":"ContainerStarted","Data":"87e2e814a0a2b7b22f0d3f175fd58334e525751ac2385a1d559d797efc57a98b"} Dec 05 06:13:55 crc kubenswrapper[4865]: I1205 06:13:55.477366 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.477342084 podStartE2EDuration="3.477342084s" podCreationTimestamp="2025-12-05 06:13:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:13:55.474623207 +0000 UTC m=+1254.754634429" watchObservedRunningTime="2025-12-05 06:13:55.477342084 +0000 UTC m=+1254.757353306" Dec 05 06:13:57 crc kubenswrapper[4865]: I1205 06:13:57.096437 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 06:13:57 crc kubenswrapper[4865]: I1205 06:13:57.097045 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="dc8877b8-5bcf-45b4-b224-755711b47627" containerName="glance-log" containerID="cri-o://645f42338fde723343788844eabcb62f269953b6fb35812c65fe83b79a7c282a" gracePeriod=30 Dec 05 06:13:57 crc kubenswrapper[4865]: I1205 06:13:57.097523 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="dc8877b8-5bcf-45b4-b224-755711b47627" containerName="glance-httpd" containerID="cri-o://23a7bde9ff9e34c33448cbbc01f9a8a20ddaed84638f360243a30c3f4643b9b0" gracePeriod=30 Dec 05 06:13:57 crc kubenswrapper[4865]: I1205 06:13:57.472599 4865 generic.go:334] "Generic (PLEG): container finished" podID="dc8877b8-5bcf-45b4-b224-755711b47627" containerID="645f42338fde723343788844eabcb62f269953b6fb35812c65fe83b79a7c282a" exitCode=143 Dec 05 06:13:57 crc kubenswrapper[4865]: I1205 06:13:57.472643 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dc8877b8-5bcf-45b4-b224-755711b47627","Type":"ContainerDied","Data":"645f42338fde723343788844eabcb62f269953b6fb35812c65fe83b79a7c282a"} Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.508353 4865 generic.go:334] "Generic (PLEG): container finished" podID="dc8877b8-5bcf-45b4-b224-755711b47627" containerID="23a7bde9ff9e34c33448cbbc01f9a8a20ddaed84638f360243a30c3f4643b9b0" exitCode=0 Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.508437 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dc8877b8-5bcf-45b4-b224-755711b47627","Type":"ContainerDied","Data":"23a7bde9ff9e34c33448cbbc01f9a8a20ddaed84638f360243a30c3f4643b9b0"} Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.755997 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.829633 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27k2w\" (UniqueName: \"kubernetes.io/projected/dc8877b8-5bcf-45b4-b224-755711b47627-kube-api-access-27k2w\") pod \"dc8877b8-5bcf-45b4-b224-755711b47627\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.829760 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-scripts\") pod \"dc8877b8-5bcf-45b4-b224-755711b47627\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.829783 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-public-tls-certs\") pod \"dc8877b8-5bcf-45b4-b224-755711b47627\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.829802 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc8877b8-5bcf-45b4-b224-755711b47627-httpd-run\") pod \"dc8877b8-5bcf-45b4-b224-755711b47627\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.829881 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-combined-ca-bundle\") pod \"dc8877b8-5bcf-45b4-b224-755711b47627\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.829925 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-config-data\") pod \"dc8877b8-5bcf-45b4-b224-755711b47627\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.829979 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc8877b8-5bcf-45b4-b224-755711b47627-logs\") pod \"dc8877b8-5bcf-45b4-b224-755711b47627\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.830003 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"dc8877b8-5bcf-45b4-b224-755711b47627\" (UID: \"dc8877b8-5bcf-45b4-b224-755711b47627\") " Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.830590 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc8877b8-5bcf-45b4-b224-755711b47627-logs" (OuterVolumeSpecName: "logs") pod "dc8877b8-5bcf-45b4-b224-755711b47627" (UID: "dc8877b8-5bcf-45b4-b224-755711b47627"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.830611 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc8877b8-5bcf-45b4-b224-755711b47627-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dc8877b8-5bcf-45b4-b224-755711b47627" (UID: "dc8877b8-5bcf-45b4-b224-755711b47627"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.845972 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "dc8877b8-5bcf-45b4-b224-755711b47627" (UID: "dc8877b8-5bcf-45b4-b224-755711b47627"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.848218 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc8877b8-5bcf-45b4-b224-755711b47627-kube-api-access-27k2w" (OuterVolumeSpecName: "kube-api-access-27k2w") pod "dc8877b8-5bcf-45b4-b224-755711b47627" (UID: "dc8877b8-5bcf-45b4-b224-755711b47627"). InnerVolumeSpecName "kube-api-access-27k2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.870048 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-scripts" (OuterVolumeSpecName: "scripts") pod "dc8877b8-5bcf-45b4-b224-755711b47627" (UID: "dc8877b8-5bcf-45b4-b224-755711b47627"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.892477 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dc8877b8-5bcf-45b4-b224-755711b47627" (UID: "dc8877b8-5bcf-45b4-b224-755711b47627"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.895691 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc8877b8-5bcf-45b4-b224-755711b47627" (UID: "dc8877b8-5bcf-45b4-b224-755711b47627"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.908341 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-config-data" (OuterVolumeSpecName: "config-data") pod "dc8877b8-5bcf-45b4-b224-755711b47627" (UID: "dc8877b8-5bcf-45b4-b224-755711b47627"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.932384 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.932431 4865 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.932441 4865 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc8877b8-5bcf-45b4-b224-755711b47627-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.932453 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.932462 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8877b8-5bcf-45b4-b224-755711b47627-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.932474 4865 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc8877b8-5bcf-45b4-b224-755711b47627-logs\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.932512 4865 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.932521 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27k2w\" (UniqueName: \"kubernetes.io/projected/dc8877b8-5bcf-45b4-b224-755711b47627-kube-api-access-27k2w\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:00 crc kubenswrapper[4865]: I1205 06:14:00.956582 4865 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.056023 4865 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.520511 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dc8877b8-5bcf-45b4-b224-755711b47627","Type":"ContainerDied","Data":"e08149c4c3bf4457e46d419decfc6f51437ed7155c1173db523079a3fd41582a"} Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.520610 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.520811 4865 scope.go:117] "RemoveContainer" containerID="23a7bde9ff9e34c33448cbbc01f9a8a20ddaed84638f360243a30c3f4643b9b0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.556124 4865 scope.go:117] "RemoveContainer" containerID="645f42338fde723343788844eabcb62f269953b6fb35812c65fe83b79a7c282a" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.560669 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.571205 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.592574 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 06:14:01 crc kubenswrapper[4865]: E1205 06:14:01.592978 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc8877b8-5bcf-45b4-b224-755711b47627" containerName="glance-log" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.592996 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc8877b8-5bcf-45b4-b224-755711b47627" containerName="glance-log" Dec 05 06:14:01 crc kubenswrapper[4865]: E1205 06:14:01.593034 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc8877b8-5bcf-45b4-b224-755711b47627" containerName="glance-httpd" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.593041 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc8877b8-5bcf-45b4-b224-755711b47627" containerName="glance-httpd" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.593212 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc8877b8-5bcf-45b4-b224-755711b47627" containerName="glance-log" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.593239 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc8877b8-5bcf-45b4-b224-755711b47627" containerName="glance-httpd" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.594281 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.599377 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.599667 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.610375 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.774583 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5aac6f-3ab8-412a-92f3-6102f9b75238-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.774633 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5aac6f-3ab8-412a-92f3-6102f9b75238-config-data\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.774745 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf5aac6f-3ab8-412a-92f3-6102f9b75238-logs\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.774949 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf5aac6f-3ab8-412a-92f3-6102f9b75238-scripts\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.775053 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrr7d\" (UniqueName: \"kubernetes.io/projected/bf5aac6f-3ab8-412a-92f3-6102f9b75238-kube-api-access-rrr7d\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.775124 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.775292 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf5aac6f-3ab8-412a-92f3-6102f9b75238-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.775374 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5aac6f-3ab8-412a-92f3-6102f9b75238-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.877402 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf5aac6f-3ab8-412a-92f3-6102f9b75238-logs\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.877478 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf5aac6f-3ab8-412a-92f3-6102f9b75238-scripts\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.877523 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrr7d\" (UniqueName: \"kubernetes.io/projected/bf5aac6f-3ab8-412a-92f3-6102f9b75238-kube-api-access-rrr7d\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.877541 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.877596 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf5aac6f-3ab8-412a-92f3-6102f9b75238-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.877629 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5aac6f-3ab8-412a-92f3-6102f9b75238-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.877648 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5aac6f-3ab8-412a-92f3-6102f9b75238-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.877677 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5aac6f-3ab8-412a-92f3-6102f9b75238-config-data\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.877996 4865 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.878095 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf5aac6f-3ab8-412a-92f3-6102f9b75238-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.878666 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf5aac6f-3ab8-412a-92f3-6102f9b75238-logs\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.886250 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5aac6f-3ab8-412a-92f3-6102f9b75238-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.886374 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf5aac6f-3ab8-412a-92f3-6102f9b75238-scripts\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.901803 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5aac6f-3ab8-412a-92f3-6102f9b75238-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.902883 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5aac6f-3ab8-412a-92f3-6102f9b75238-config-data\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.906591 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrr7d\" (UniqueName: \"kubernetes.io/projected/bf5aac6f-3ab8-412a-92f3-6102f9b75238-kube-api-access-rrr7d\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:01 crc kubenswrapper[4865]: I1205 06:14:01.943193 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bf5aac6f-3ab8-412a-92f3-6102f9b75238\") " pod="openstack/glance-default-external-api-0" Dec 05 06:14:02 crc kubenswrapper[4865]: I1205 06:14:02.222167 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 06:14:02 crc kubenswrapper[4865]: I1205 06:14:02.859404 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 06:14:02 crc kubenswrapper[4865]: I1205 06:14:02.860008 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 06:14:02 crc kubenswrapper[4865]: I1205 06:14:02.906224 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 06:14:02 crc kubenswrapper[4865]: I1205 06:14:02.912393 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 06:14:02 crc kubenswrapper[4865]: I1205 06:14:02.944379 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 06:14:03 crc kubenswrapper[4865]: I1205 06:14:03.021030 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc8877b8-5bcf-45b4-b224-755711b47627" path="/var/lib/kubelet/pods/dc8877b8-5bcf-45b4-b224-755711b47627/volumes" Dec 05 06:14:03 crc kubenswrapper[4865]: I1205 06:14:03.539684 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf5aac6f-3ab8-412a-92f3-6102f9b75238","Type":"ContainerStarted","Data":"a97b46500c6d0c19e3cfb9df593282d5580f13ccb8c18bb9ed78bf4e79aa0b8c"} Dec 05 06:14:03 crc kubenswrapper[4865]: I1205 06:14:03.539980 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 06:14:03 crc kubenswrapper[4865]: I1205 06:14:03.539996 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 06:14:04 crc kubenswrapper[4865]: I1205 06:14:04.314494 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:14:04 crc kubenswrapper[4865]: I1205 06:14:04.461999 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:14:04 crc kubenswrapper[4865]: I1205 06:14:04.550835 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf5aac6f-3ab8-412a-92f3-6102f9b75238","Type":"ContainerStarted","Data":"2108b17d8742446091d8d4da7701e1e1c8ef12e00bdc1ba362e3db9feace88c4"} Dec 05 06:14:04 crc kubenswrapper[4865]: I1205 06:14:04.550873 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf5aac6f-3ab8-412a-92f3-6102f9b75238","Type":"ContainerStarted","Data":"2933907305abad12d888099934cabbf88d7994b4918823a6bd4e2453c7e4a12c"} Dec 05 06:14:04 crc kubenswrapper[4865]: I1205 06:14:04.576070 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.576048627 podStartE2EDuration="3.576048627s" podCreationTimestamp="2025-12-05 06:14:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:14:04.569840781 +0000 UTC m=+1263.849852003" watchObservedRunningTime="2025-12-05 06:14:04.576048627 +0000 UTC m=+1263.856059849" Dec 05 06:14:05 crc kubenswrapper[4865]: I1205 06:14:05.562484 4865 generic.go:334] "Generic (PLEG): 
container finished" podID="9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1" containerID="6870896ac2ef3a4699b002484f042cdd44757cdd6a9115f2c877d993d650e74e" exitCode=0 Dec 05 06:14:05 crc kubenswrapper[4865]: I1205 06:14:05.562567 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-g2g6p" event={"ID":"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1","Type":"ContainerDied","Data":"6870896ac2ef3a4699b002484f042cdd44757cdd6a9115f2c877d993d650e74e"} Dec 05 06:14:06 crc kubenswrapper[4865]: I1205 06:14:06.059418 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 06:14:06 crc kubenswrapper[4865]: I1205 06:14:06.059513 4865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 06:14:06 crc kubenswrapper[4865]: I1205 06:14:06.197870 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 06:14:06 crc kubenswrapper[4865]: I1205 06:14:06.396460 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:14:06 crc kubenswrapper[4865]: I1205 06:14:06.473179 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-78c59b79fd-5jlv4" Dec 05 06:14:06 crc kubenswrapper[4865]: I1205 06:14:06.544966 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bd68dd9b8-z62zt"] Dec 05 06:14:06 crc kubenswrapper[4865]: I1205 06:14:06.572310 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-bd68dd9b8-z62zt" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerName="horizon-log" containerID="cri-o://ffdf18b423c11d0cd7a29a406d546d87ab5e2562b00f5fe331d0abd8d62eb69f" gracePeriod=30 Dec 05 06:14:06 crc kubenswrapper[4865]: I1205 06:14:06.572866 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-bd68dd9b8-z62zt" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerName="horizon" containerID="cri-o://ec7463e2f2848e1408401056294c94afe477842efd894f3e32e53fc184fec217" gracePeriod=30 Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.039546 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-g2g6p" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.083074 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-config-data\") pod \"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\" (UID: \"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\") " Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.083466 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-scripts\") pod \"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\" (UID: \"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\") " Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.083670 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgdh4\" (UniqueName: \"kubernetes.io/projected/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-kube-api-access-jgdh4\") pod \"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\" (UID: \"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\") " Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.083768 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-combined-ca-bundle\") pod \"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\" (UID: \"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1\") " Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.105664 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-scripts" (OuterVolumeSpecName: "scripts") pod "9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1" (UID: "9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.112682 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-kube-api-access-jgdh4" (OuterVolumeSpecName: "kube-api-access-jgdh4") pod "9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1" (UID: "9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1"). InnerVolumeSpecName "kube-api-access-jgdh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.150437 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-config-data" (OuterVolumeSpecName: "config-data") pod "9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1" (UID: "9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.182733 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1" (UID: "9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.185611 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.185651 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.185661 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.185670 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgdh4\" (UniqueName: \"kubernetes.io/projected/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1-kube-api-access-jgdh4\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.603675 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-g2g6p" event={"ID":"9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1","Type":"ContainerDied","Data":"89ed7800bf7ae2c8e5678e239949816f78c03341fe289db77ff4d76b30f12fed"} Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.603794 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89ed7800bf7ae2c8e5678e239949816f78c03341fe289db77ff4d76b30f12fed" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.603919 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-g2g6p" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.695928 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 06:14:07 crc kubenswrapper[4865]: E1205 06:14:07.696401 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1" containerName="nova-cell0-conductor-db-sync" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.696415 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1" containerName="nova-cell0-conductor-db-sync" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.696624 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1" containerName="nova-cell0-conductor-db-sync" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.697358 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.703568 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vrbg2" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.703741 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.733024 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.799079 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d470cf8-c2ca-4bc1-ab26-d8762af687d1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5d470cf8-c2ca-4bc1-ab26-d8762af687d1\") " pod="openstack/nova-cell0-conductor-0" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.799177 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d470cf8-c2ca-4bc1-ab26-d8762af687d1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5d470cf8-c2ca-4bc1-ab26-d8762af687d1\") " pod="openstack/nova-cell0-conductor-0" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.799245 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trtr8\" (UniqueName: \"kubernetes.io/projected/5d470cf8-c2ca-4bc1-ab26-d8762af687d1-kube-api-access-trtr8\") pod \"nova-cell0-conductor-0\" (UID: \"5d470cf8-c2ca-4bc1-ab26-d8762af687d1\") " pod="openstack/nova-cell0-conductor-0" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.900909 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d470cf8-c2ca-4bc1-ab26-d8762af687d1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5d470cf8-c2ca-4bc1-ab26-d8762af687d1\") " pod="openstack/nova-cell0-conductor-0" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.901298 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d470cf8-c2ca-4bc1-ab26-d8762af687d1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5d470cf8-c2ca-4bc1-ab26-d8762af687d1\") " pod="openstack/nova-cell0-conductor-0" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.901357 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trtr8\" (UniqueName: \"kubernetes.io/projected/5d470cf8-c2ca-4bc1-ab26-d8762af687d1-kube-api-access-trtr8\") pod \"nova-cell0-conductor-0\" (UID: \"5d470cf8-c2ca-4bc1-ab26-d8762af687d1\") " pod="openstack/nova-cell0-conductor-0" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.907575 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d470cf8-c2ca-4bc1-ab26-d8762af687d1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5d470cf8-c2ca-4bc1-ab26-d8762af687d1\") " pod="openstack/nova-cell0-conductor-0" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.907808 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d470cf8-c2ca-4bc1-ab26-d8762af687d1-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"5d470cf8-c2ca-4bc1-ab26-d8762af687d1\") " pod="openstack/nova-cell0-conductor-0" Dec 05 06:14:07 crc kubenswrapper[4865]: I1205 06:14:07.925003 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trtr8\" (UniqueName: \"kubernetes.io/projected/5d470cf8-c2ca-4bc1-ab26-d8762af687d1-kube-api-access-trtr8\") pod \"nova-cell0-conductor-0\" (UID: \"5d470cf8-c2ca-4bc1-ab26-d8762af687d1\") " pod="openstack/nova-cell0-conductor-0" Dec 05 06:14:08 crc kubenswrapper[4865]: I1205 06:14:08.037154 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 06:14:08 crc kubenswrapper[4865]: I1205 06:14:08.596350 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 06:14:08 crc kubenswrapper[4865]: I1205 06:14:08.628644 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5d470cf8-c2ca-4bc1-ab26-d8762af687d1","Type":"ContainerStarted","Data":"aa59e138516dd9a9bd2e311cf584c45c217debde2c63b4732393b3ef167c89e0"} Dec 05 06:14:08 crc kubenswrapper[4865]: I1205 06:14:08.803913 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 05 06:14:09 crc kubenswrapper[4865]: I1205 06:14:09.639162 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5d470cf8-c2ca-4bc1-ab26-d8762af687d1","Type":"ContainerStarted","Data":"9eac57d7d56406964f768b1051721f7ab76c1a27c0426fac9c38bc7c8dda3cce"} Dec 05 06:14:09 crc kubenswrapper[4865]: I1205 06:14:09.639312 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 05 06:14:09 crc kubenswrapper[4865]: I1205 06:14:09.664748 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.664724425 podStartE2EDuration="2.664724425s" podCreationTimestamp="2025-12-05 06:14:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:14:09.658951081 +0000 UTC m=+1268.938962303" watchObservedRunningTime="2025-12-05 06:14:09.664724425 +0000 UTC m=+1268.944735647" Dec 05 06:14:10 crc kubenswrapper[4865]: I1205 06:14:10.653908 4865 generic.go:334] "Generic (PLEG): container finished" podID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerID="ec7463e2f2848e1408401056294c94afe477842efd894f3e32e53fc184fec217" exitCode=0 Dec 05 06:14:10 crc kubenswrapper[4865]: I1205 06:14:10.654056 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bd68dd9b8-z62zt" event={"ID":"ca38ca20-0d35-4058-b0f6-bbe4251c6aab","Type":"ContainerDied","Data":"ec7463e2f2848e1408401056294c94afe477842efd894f3e32e53fc184fec217"} Dec 05 06:14:10 crc kubenswrapper[4865]: I1205 06:14:10.654502 4865 scope.go:117] "RemoveContainer" containerID="ae9988b24b0cc529f27a61e58c049c77ec8edcedb21946f5111a11587be650d1" Dec 05 06:14:11 crc kubenswrapper[4865]: I1205 06:14:11.156452 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-bd68dd9b8-z62zt" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 
10.217.0.146:8443: connect: connection refused" Dec 05 06:14:12 crc kubenswrapper[4865]: I1205 06:14:12.223025 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 06:14:12 crc kubenswrapper[4865]: I1205 06:14:12.223099 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 06:14:12 crc kubenswrapper[4865]: I1205 06:14:12.257606 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 06:14:12 crc kubenswrapper[4865]: I1205 06:14:12.277093 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 06:14:12 crc kubenswrapper[4865]: I1205 06:14:12.674577 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 06:14:12 crc kubenswrapper[4865]: I1205 06:14:12.674872 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.086740 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.560009 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-lt8qn"] Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.561860 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lt8qn" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.569604 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.569881 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.569629 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lt8qn"] Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.713685 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f7bz\" (UniqueName: \"kubernetes.io/projected/67260032-55e2-4709-84b1-577259ffa891-kube-api-access-6f7bz\") pod \"nova-cell0-cell-mapping-lt8qn\" (UID: \"67260032-55e2-4709-84b1-577259ffa891\") " pod="openstack/nova-cell0-cell-mapping-lt8qn" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.713752 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-scripts\") pod \"nova-cell0-cell-mapping-lt8qn\" (UID: \"67260032-55e2-4709-84b1-577259ffa891\") " pod="openstack/nova-cell0-cell-mapping-lt8qn" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.713798 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lt8qn\" (UID: \"67260032-55e2-4709-84b1-577259ffa891\") " pod="openstack/nova-cell0-cell-mapping-lt8qn" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.713892 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-config-data\") pod \"nova-cell0-cell-mapping-lt8qn\" (UID: \"67260032-55e2-4709-84b1-577259ffa891\") " pod="openstack/nova-cell0-cell-mapping-lt8qn" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.762183 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.763711 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.781096 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.815325 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-config-data\") pod \"nova-cell0-cell-mapping-lt8qn\" (UID: \"67260032-55e2-4709-84b1-577259ffa891\") " pod="openstack/nova-cell0-cell-mapping-lt8qn" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.815434 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f7bz\" (UniqueName: \"kubernetes.io/projected/67260032-55e2-4709-84b1-577259ffa891-kube-api-access-6f7bz\") pod \"nova-cell0-cell-mapping-lt8qn\" (UID: \"67260032-55e2-4709-84b1-577259ffa891\") " pod="openstack/nova-cell0-cell-mapping-lt8qn" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.815483 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-scripts\") pod \"nova-cell0-cell-mapping-lt8qn\" (UID: \"67260032-55e2-4709-84b1-577259ffa891\") " pod="openstack/nova-cell0-cell-mapping-lt8qn" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.815541 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lt8qn\" (UID: \"67260032-55e2-4709-84b1-577259ffa891\") " pod="openstack/nova-cell0-cell-mapping-lt8qn" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.817418 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.820144 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.823910 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-scripts\") pod \"nova-cell0-cell-mapping-lt8qn\" (UID: \"67260032-55e2-4709-84b1-577259ffa891\") " pod="openstack/nova-cell0-cell-mapping-lt8qn" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.828271 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.831736 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lt8qn\" (UID: \"67260032-55e2-4709-84b1-577259ffa891\") " pod="openstack/nova-cell0-cell-mapping-lt8qn" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.845590 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-config-data\") pod \"nova-cell0-cell-mapping-lt8qn\" (UID: \"67260032-55e2-4709-84b1-577259ffa891\") " pod="openstack/nova-cell0-cell-mapping-lt8qn" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.885515 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f7bz\" (UniqueName: \"kubernetes.io/projected/67260032-55e2-4709-84b1-577259ffa891-kube-api-access-6f7bz\") pod \"nova-cell0-cell-mapping-lt8qn\" (UID: \"67260032-55e2-4709-84b1-577259ffa891\") " pod="openstack/nova-cell0-cell-mapping-lt8qn" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.885588 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.904457 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lt8qn" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.920866 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c81d161-71fa-4b7d-b386-51c0eb914cb4-logs\") pod \"nova-api-0\" (UID: \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\") " pod="openstack/nova-api-0" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.920921 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9f6f7e-e815-4cad-a620-08d913f360ec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ce9f6f7e-e815-4cad-a620-08d913f360ec\") " pod="openstack/nova-scheduler-0" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.920961 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c81d161-71fa-4b7d-b386-51c0eb914cb4-config-data\") pod \"nova-api-0\" (UID: \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\") " pod="openstack/nova-api-0" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.921014 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t5vr\" (UniqueName: \"kubernetes.io/projected/4c81d161-71fa-4b7d-b386-51c0eb914cb4-kube-api-access-9t5vr\") pod \"nova-api-0\" (UID: \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\") " pod="openstack/nova-api-0" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.921036 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls6bv\" (UniqueName: \"kubernetes.io/projected/ce9f6f7e-e815-4cad-a620-08d913f360ec-kube-api-access-ls6bv\") pod \"nova-scheduler-0\" (UID: \"ce9f6f7e-e815-4cad-a620-08d913f360ec\") " pod="openstack/nova-scheduler-0" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.921070 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c81d161-71fa-4b7d-b386-51c0eb914cb4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\") " pod="openstack/nova-api-0" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.921149 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce9f6f7e-e815-4cad-a620-08d913f360ec-config-data\") pod \"nova-scheduler-0\" (UID: \"ce9f6f7e-e815-4cad-a620-08d913f360ec\") " pod="openstack/nova-scheduler-0" Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.945314 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.993439 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:14:13 crc kubenswrapper[4865]: I1205 06:14:13.996487 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.001904 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.024744 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c81d161-71fa-4b7d-b386-51c0eb914cb4-logs\") pod \"nova-api-0\" (UID: \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\") " pod="openstack/nova-api-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.024793 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9f6f7e-e815-4cad-a620-08d913f360ec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ce9f6f7e-e815-4cad-a620-08d913f360ec\") " pod="openstack/nova-scheduler-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.024837 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c81d161-71fa-4b7d-b386-51c0eb914cb4-config-data\") pod \"nova-api-0\" (UID: \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\") " pod="openstack/nova-api-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.024893 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t5vr\" (UniqueName: \"kubernetes.io/projected/4c81d161-71fa-4b7d-b386-51c0eb914cb4-kube-api-access-9t5vr\") pod \"nova-api-0\" (UID: \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\") " pod="openstack/nova-api-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.024914 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls6bv\" (UniqueName: \"kubernetes.io/projected/ce9f6f7e-e815-4cad-a620-08d913f360ec-kube-api-access-ls6bv\") pod \"nova-scheduler-0\" (UID: \"ce9f6f7e-e815-4cad-a620-08d913f360ec\") " pod="openstack/nova-scheduler-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.024951 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c81d161-71fa-4b7d-b386-51c0eb914cb4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\") " pod="openstack/nova-api-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.025011 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce9f6f7e-e815-4cad-a620-08d913f360ec-config-data\") pod \"nova-scheduler-0\" (UID: \"ce9f6f7e-e815-4cad-a620-08d913f360ec\") " pod="openstack/nova-scheduler-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.027149 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c81d161-71fa-4b7d-b386-51c0eb914cb4-logs\") pod \"nova-api-0\" (UID: \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\") " pod="openstack/nova-api-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.033313 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c81d161-71fa-4b7d-b386-51c0eb914cb4-config-data\") pod \"nova-api-0\" (UID: \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\") " pod="openstack/nova-api-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.042593 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ce9f6f7e-e815-4cad-a620-08d913f360ec-config-data\") pod \"nova-scheduler-0\" (UID: \"ce9f6f7e-e815-4cad-a620-08d913f360ec\") " pod="openstack/nova-scheduler-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.048649 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9f6f7e-e815-4cad-a620-08d913f360ec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ce9f6f7e-e815-4cad-a620-08d913f360ec\") " pod="openstack/nova-scheduler-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.053903 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.055316 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.059469 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c81d161-71fa-4b7d-b386-51c0eb914cb4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\") " pod="openstack/nova-api-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.059813 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.082709 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.099943 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.100634 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t5vr\" (UniqueName: \"kubernetes.io/projected/4c81d161-71fa-4b7d-b386-51c0eb914cb4-kube-api-access-9t5vr\") pod \"nova-api-0\" (UID: \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\") " pod="openstack/nova-api-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.106893 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls6bv\" (UniqueName: \"kubernetes.io/projected/ce9f6f7e-e815-4cad-a620-08d913f360ec-kube-api-access-ls6bv\") pod \"nova-scheduler-0\" (UID: \"ce9f6f7e-e815-4cad-a620-08d913f360ec\") " pod="openstack/nova-scheduler-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.124011 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.131199 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\") " pod="openstack/nova-metadata-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.131235 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btwfn\" (UniqueName: \"kubernetes.io/projected/4e9b57b4-0806-43ad-8cb6-881ee1854ab5-kube-api-access-btwfn\") pod \"nova-cell1-novncproxy-0\" (UID: \"4e9b57b4-0806-43ad-8cb6-881ee1854ab5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.131276 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-logs\") pod \"nova-metadata-0\" (UID: \"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\") " pod="openstack/nova-metadata-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.131318 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7lts\" (UniqueName: \"kubernetes.io/projected/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-kube-api-access-t7lts\") pod \"nova-metadata-0\" (UID: \"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\") " pod="openstack/nova-metadata-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.131355 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9b57b4-0806-43ad-8cb6-881ee1854ab5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4e9b57b4-0806-43ad-8cb6-881ee1854ab5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.131373 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-config-data\") pod \"nova-metadata-0\" (UID: \"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\") " pod="openstack/nova-metadata-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.131424 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9b57b4-0806-43ad-8cb6-881ee1854ab5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4e9b57b4-0806-43ad-8cb6-881ee1854ab5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.240203 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7lts\" (UniqueName: \"kubernetes.io/projected/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-kube-api-access-t7lts\") pod \"nova-metadata-0\" (UID: \"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\") " pod="openstack/nova-metadata-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.241365 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9b57b4-0806-43ad-8cb6-881ee1854ab5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4e9b57b4-0806-43ad-8cb6-881ee1854ab5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:14 crc 
kubenswrapper[4865]: I1205 06:14:14.241528 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-config-data\") pod \"nova-metadata-0\" (UID: \"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\") " pod="openstack/nova-metadata-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.242010 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9b57b4-0806-43ad-8cb6-881ee1854ab5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4e9b57b4-0806-43ad-8cb6-881ee1854ab5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.242896 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\") " pod="openstack/nova-metadata-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.242924 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btwfn\" (UniqueName: \"kubernetes.io/projected/4e9b57b4-0806-43ad-8cb6-881ee1854ab5-kube-api-access-btwfn\") pod \"nova-cell1-novncproxy-0\" (UID: \"4e9b57b4-0806-43ad-8cb6-881ee1854ab5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.243080 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-logs\") pod \"nova-metadata-0\" (UID: \"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\") " pod="openstack/nova-metadata-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.246085 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-logs\") pod \"nova-metadata-0\" (UID: \"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\") " pod="openstack/nova-metadata-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.247783 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9b57b4-0806-43ad-8cb6-881ee1854ab5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4e9b57b4-0806-43ad-8cb6-881ee1854ab5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.253999 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-config-data\") pod \"nova-metadata-0\" (UID: \"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\") " pod="openstack/nova-metadata-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.254625 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9b57b4-0806-43ad-8cb6-881ee1854ab5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4e9b57b4-0806-43ad-8cb6-881ee1854ab5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.272754 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\") " pod="openstack/nova-metadata-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.279089 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7lts\" (UniqueName: \"kubernetes.io/projected/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-kube-api-access-t7lts\") pod \"nova-metadata-0\" (UID: \"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\") " pod="openstack/nova-metadata-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.279795 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btwfn\" (UniqueName: \"kubernetes.io/projected/4e9b57b4-0806-43ad-8cb6-881ee1854ab5-kube-api-access-btwfn\") pod \"nova-cell1-novncproxy-0\" (UID: \"4e9b57b4-0806-43ad-8cb6-881ee1854ab5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.342067 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-8pvjj"] Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.354132 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.361977 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-8pvjj"] Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.382418 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.447640 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.459495 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.460267 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-8pvjj\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.460311 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46fz2\" (UniqueName: \"kubernetes.io/projected/929b303b-d676-4548-9186-c29a7921cb8d-kube-api-access-46fz2\") pod \"dnsmasq-dns-865f5d856f-8pvjj\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.460398 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-config\") pod \"dnsmasq-dns-865f5d856f-8pvjj\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.460471 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-dns-svc\") pod \"dnsmasq-dns-865f5d856f-8pvjj\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.460487 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-8pvjj\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.460540 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-8pvjj\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.562649 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-config\") pod \"dnsmasq-dns-865f5d856f-8pvjj\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.562721 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-dns-svc\") pod \"dnsmasq-dns-865f5d856f-8pvjj\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.562738 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-8pvjj\" (UID: 
\"929b303b-d676-4548-9186-c29a7921cb8d\") " pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.562804 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-8pvjj\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.562867 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-8pvjj\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.562887 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46fz2\" (UniqueName: \"kubernetes.io/projected/929b303b-d676-4548-9186-c29a7921cb8d-kube-api-access-46fz2\") pod \"dnsmasq-dns-865f5d856f-8pvjj\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.564159 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-config\") pod \"dnsmasq-dns-865f5d856f-8pvjj\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.567402 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-8pvjj\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.567958 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-dns-svc\") pod \"dnsmasq-dns-865f5d856f-8pvjj\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.567970 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-8pvjj\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.568471 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-8pvjj\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.588576 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46fz2\" (UniqueName: \"kubernetes.io/projected/929b303b-d676-4548-9186-c29a7921cb8d-kube-api-access-46fz2\") pod \"dnsmasq-dns-865f5d856f-8pvjj\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " 
pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.684380 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.724248 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.752906 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lt8qn"] Dec 05 06:14:14 crc kubenswrapper[4865]: I1205 06:14:14.797668 4865 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.267462 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.335019 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.399987 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.516472 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-skfnv"] Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.529180 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-skfnv" Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.538125 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.538631 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.561138 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-skfnv"] Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.642065 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-8pvjj"] Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.694215 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479f47b5-b756-41b2-af12-ca6fcbe867a3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-skfnv\" (UID: \"479f47b5-b756-41b2-af12-ca6fcbe867a3\") " pod="openstack/nova-cell1-conductor-db-sync-skfnv" Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.694494 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/479f47b5-b756-41b2-af12-ca6fcbe867a3-config-data\") pod \"nova-cell1-conductor-db-sync-skfnv\" (UID: \"479f47b5-b756-41b2-af12-ca6fcbe867a3\") " pod="openstack/nova-cell1-conductor-db-sync-skfnv" Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.694613 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/479f47b5-b756-41b2-af12-ca6fcbe867a3-scripts\") pod \"nova-cell1-conductor-db-sync-skfnv\" (UID: \"479f47b5-b756-41b2-af12-ca6fcbe867a3\") " pod="openstack/nova-cell1-conductor-db-sync-skfnv" Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.694719 4865 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctfck\" (UniqueName: \"kubernetes.io/projected/479f47b5-b756-41b2-af12-ca6fcbe867a3-kube-api-access-ctfck\") pod \"nova-cell1-conductor-db-sync-skfnv\" (UID: \"479f47b5-b756-41b2-af12-ca6fcbe867a3\") " pod="openstack/nova-cell1-conductor-db-sync-skfnv" Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.728266 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lt8qn" event={"ID":"67260032-55e2-4709-84b1-577259ffa891","Type":"ContainerStarted","Data":"ba7bfaeeb4e0660eb347f24521eb2c636d6843df8db40fff9d365c3fe42dbabb"} Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.728312 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lt8qn" event={"ID":"67260032-55e2-4709-84b1-577259ffa891","Type":"ContainerStarted","Data":"5f6d0f71400b9250318e9280a42d64dc439f250327abda57f8d4ef066bee509e"} Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.735609 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" event={"ID":"929b303b-d676-4548-9186-c29a7921cb8d","Type":"ContainerStarted","Data":"d393df4122c18ee5899ed37f019fe32ae7959b9a62bcf51dbeadc32639c5e1a2"} Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.738430 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ce9f6f7e-e815-4cad-a620-08d913f360ec","Type":"ContainerStarted","Data":"7010e373d1bbbba3bc84882d23aff7df2bb5e383930ef4f310ae6a9a58b2dfe3"} Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.740117 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4e9b57b4-0806-43ad-8cb6-881ee1854ab5","Type":"ContainerStarted","Data":"ef90a388eeab531b534d0a622ebad3894ddd76014ab986310d9ab47522486e05"} Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.741831 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c81d161-71fa-4b7d-b386-51c0eb914cb4","Type":"ContainerStarted","Data":"7f1cc30fc1b698b1a21040825db41cf693150268e36da5eb1139e664d1b285b4"} Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.742848 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bbf49c8-107b-45e3-9d7f-a3203023f2bb","Type":"ContainerStarted","Data":"94f464fcd2a8f397406401ff7e3a87a5d29152a5a4db6f3b252ea7a95c1e0185"} Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.796343 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479f47b5-b756-41b2-af12-ca6fcbe867a3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-skfnv\" (UID: \"479f47b5-b756-41b2-af12-ca6fcbe867a3\") " pod="openstack/nova-cell1-conductor-db-sync-skfnv" Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.796433 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/479f47b5-b756-41b2-af12-ca6fcbe867a3-config-data\") pod \"nova-cell1-conductor-db-sync-skfnv\" (UID: \"479f47b5-b756-41b2-af12-ca6fcbe867a3\") " pod="openstack/nova-cell1-conductor-db-sync-skfnv" Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.796461 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/479f47b5-b756-41b2-af12-ca6fcbe867a3-scripts\") pod 
\"nova-cell1-conductor-db-sync-skfnv\" (UID: \"479f47b5-b756-41b2-af12-ca6fcbe867a3\") " pod="openstack/nova-cell1-conductor-db-sync-skfnv" Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.796486 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctfck\" (UniqueName: \"kubernetes.io/projected/479f47b5-b756-41b2-af12-ca6fcbe867a3-kube-api-access-ctfck\") pod \"nova-cell1-conductor-db-sync-skfnv\" (UID: \"479f47b5-b756-41b2-af12-ca6fcbe867a3\") " pod="openstack/nova-cell1-conductor-db-sync-skfnv" Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.804552 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/479f47b5-b756-41b2-af12-ca6fcbe867a3-scripts\") pod \"nova-cell1-conductor-db-sync-skfnv\" (UID: \"479f47b5-b756-41b2-af12-ca6fcbe867a3\") " pod="openstack/nova-cell1-conductor-db-sync-skfnv" Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.805165 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479f47b5-b756-41b2-af12-ca6fcbe867a3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-skfnv\" (UID: \"479f47b5-b756-41b2-af12-ca6fcbe867a3\") " pod="openstack/nova-cell1-conductor-db-sync-skfnv" Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.807291 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/479f47b5-b756-41b2-af12-ca6fcbe867a3-config-data\") pod \"nova-cell1-conductor-db-sync-skfnv\" (UID: \"479f47b5-b756-41b2-af12-ca6fcbe867a3\") " pod="openstack/nova-cell1-conductor-db-sync-skfnv" Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.814076 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctfck\" (UniqueName: \"kubernetes.io/projected/479f47b5-b756-41b2-af12-ca6fcbe867a3-kube-api-access-ctfck\") pod \"nova-cell1-conductor-db-sync-skfnv\" (UID: \"479f47b5-b756-41b2-af12-ca6fcbe867a3\") " pod="openstack/nova-cell1-conductor-db-sync-skfnv" Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.863862 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-skfnv" Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.913518 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.913653 4865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.916670 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 06:14:15 crc kubenswrapper[4865]: I1205 06:14:15.938834 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-lt8qn" podStartSLOduration=2.938803826 podStartE2EDuration="2.938803826s" podCreationTimestamp="2025-12-05 06:14:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:14:15.757280719 +0000 UTC m=+1275.037291941" watchObservedRunningTime="2025-12-05 06:14:15.938803826 +0000 UTC m=+1275.218815048" Dec 05 06:14:16 crc kubenswrapper[4865]: I1205 06:14:16.354322 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-skfnv"] Dec 05 06:14:16 crc kubenswrapper[4865]: I1205 06:14:16.779571 4865 generic.go:334] "Generic (PLEG): container finished" podID="929b303b-d676-4548-9186-c29a7921cb8d" containerID="5fd9046a7115f6ced3a8f3cbf4acd158d79abdc939f7080b26a5239176a3489b" exitCode=0 Dec 05 06:14:16 crc kubenswrapper[4865]: I1205 06:14:16.779928 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" event={"ID":"929b303b-d676-4548-9186-c29a7921cb8d","Type":"ContainerDied","Data":"5fd9046a7115f6ced3a8f3cbf4acd158d79abdc939f7080b26a5239176a3489b"} Dec 05 06:14:16 crc kubenswrapper[4865]: I1205 06:14:16.795056 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-skfnv" event={"ID":"479f47b5-b756-41b2-af12-ca6fcbe867a3","Type":"ContainerStarted","Data":"a8ec670a519c895770256b56e62232ad7cd36b7e39e166cdefeec80fa4470e4d"} Dec 05 06:14:16 crc kubenswrapper[4865]: I1205 06:14:16.795118 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-skfnv" event={"ID":"479f47b5-b756-41b2-af12-ca6fcbe867a3","Type":"ContainerStarted","Data":"4ab84d8da7d0f1b77d44a3b985b34509875777cc3cc96706ee63e7b168dbf2be"} Dec 05 06:14:16 crc kubenswrapper[4865]: I1205 06:14:16.831295 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-skfnv" podStartSLOduration=1.8312699700000001 podStartE2EDuration="1.83126997s" podCreationTimestamp="2025-12-05 06:14:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:14:16.823705045 +0000 UTC m=+1276.103716267" watchObservedRunningTime="2025-12-05 06:14:16.83126997 +0000 UTC m=+1276.111281182" Dec 05 06:14:17 crc kubenswrapper[4865]: I1205 06:14:17.744200 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 06:14:17 crc kubenswrapper[4865]: I1205 06:14:17.754464 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:14:19 crc kubenswrapper[4865]: I1205 06:14:19.832676 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" event={"ID":"929b303b-d676-4548-9186-c29a7921cb8d","Type":"ContainerStarted","Data":"cc4f42d998e4e3ff72e66cf93d52231d1d6927ca57713c318baaaebdbed7af19"} Dec 05 06:14:19 crc kubenswrapper[4865]: I1205 06:14:19.856459 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" podStartSLOduration=5.856437251 podStartE2EDuration="5.856437251s" podCreationTimestamp="2025-12-05 06:14:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:14:19.851080059 +0000 UTC m=+1279.131091281" watchObservedRunningTime="2025-12-05 06:14:19.856437251 +0000 UTC m=+1279.136448473" Dec 05 06:14:20 crc kubenswrapper[4865]: I1205 06:14:20.853389 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4e9b57b4-0806-43ad-8cb6-881ee1854ab5","Type":"ContainerStarted","Data":"a45670bb6aa76fe78307f8f0d28430d5162b3bb373021b0df7db512ed5f42281"} Dec 05 06:14:20 crc kubenswrapper[4865]: I1205 06:14:20.853537 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="4e9b57b4-0806-43ad-8cb6-881ee1854ab5" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a45670bb6aa76fe78307f8f0d28430d5162b3bb373021b0df7db512ed5f42281" gracePeriod=30 Dec 05 06:14:20 crc kubenswrapper[4865]: I1205 06:14:20.863403 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c81d161-71fa-4b7d-b386-51c0eb914cb4","Type":"ContainerStarted","Data":"63d7de5b1932f0adce37697171115e7b464af3a2630458868d9f71ebb24261de"} Dec 05 06:14:20 crc kubenswrapper[4865]: I1205 06:14:20.863445 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c81d161-71fa-4b7d-b386-51c0eb914cb4","Type":"ContainerStarted","Data":"ed1fd4bf6a3c5ae267f278e43bc60e370c2d4bd2d5acaba3efaf95ace1f7c7dc"} Dec 05 06:14:20 crc kubenswrapper[4865]: I1205 06:14:20.866916 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bbf49c8-107b-45e3-9d7f-a3203023f2bb","Type":"ContainerStarted","Data":"db97bf61fafbe7667ef16e258f977c1d01452157666988170700836e61160401"} Dec 05 06:14:20 crc kubenswrapper[4865]: I1205 06:14:20.866953 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bbf49c8-107b-45e3-9d7f-a3203023f2bb","Type":"ContainerStarted","Data":"f122e3eb8ad7d86eccb265103f75054a4ebe9a344a2dd0b5d7b3f2a8e85b0083"} Dec 05 06:14:20 crc kubenswrapper[4865]: I1205 06:14:20.867063 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0bbf49c8-107b-45e3-9d7f-a3203023f2bb" containerName="nova-metadata-log" containerID="cri-o://f122e3eb8ad7d86eccb265103f75054a4ebe9a344a2dd0b5d7b3f2a8e85b0083" gracePeriod=30 Dec 05 06:14:20 crc kubenswrapper[4865]: I1205 06:14:20.867353 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0bbf49c8-107b-45e3-9d7f-a3203023f2bb" containerName="nova-metadata-metadata" containerID="cri-o://db97bf61fafbe7667ef16e258f977c1d01452157666988170700836e61160401" gracePeriod=30 Dec 05 06:14:20 crc kubenswrapper[4865]: I1205 06:14:20.876238 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" 
podStartSLOduration=3.276463398 podStartE2EDuration="7.876200289s" podCreationTimestamp="2025-12-05 06:14:13 +0000 UTC" firstStartedPulling="2025-12-05 06:14:15.384034825 +0000 UTC m=+1274.664046047" lastFinishedPulling="2025-12-05 06:14:19.983771716 +0000 UTC m=+1279.263782938" observedRunningTime="2025-12-05 06:14:20.87272494 +0000 UTC m=+1280.152736162" watchObservedRunningTime="2025-12-05 06:14:20.876200289 +0000 UTC m=+1280.156211511" Dec 05 06:14:20 crc kubenswrapper[4865]: I1205 06:14:20.882958 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ce9f6f7e-e815-4cad-a620-08d913f360ec","Type":"ContainerStarted","Data":"3ec1846e07c1ddff1db5e75802c79654fc6b3f26b92251e6679a56f90186933e"} Dec 05 06:14:20 crc kubenswrapper[4865]: I1205 06:14:20.883015 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:20 crc kubenswrapper[4865]: I1205 06:14:20.961258 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.408820035 podStartE2EDuration="7.961235209s" podCreationTimestamp="2025-12-05 06:14:13 +0000 UTC" firstStartedPulling="2025-12-05 06:14:15.441737277 +0000 UTC m=+1274.721748499" lastFinishedPulling="2025-12-05 06:14:19.994152451 +0000 UTC m=+1279.274163673" observedRunningTime="2025-12-05 06:14:20.955929538 +0000 UTC m=+1280.235940780" watchObservedRunningTime="2025-12-05 06:14:20.961235209 +0000 UTC m=+1280.241246431" Dec 05 06:14:21 crc kubenswrapper[4865]: I1205 06:14:21.049801 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.362357893 podStartE2EDuration="8.0497739s" podCreationTimestamp="2025-12-05 06:14:13 +0000 UTC" firstStartedPulling="2025-12-05 06:14:15.30271318 +0000 UTC m=+1274.582724402" lastFinishedPulling="2025-12-05 06:14:19.990129187 +0000 UTC m=+1279.270140409" observedRunningTime="2025-12-05 06:14:21.013384434 +0000 UTC m=+1280.293395656" watchObservedRunningTime="2025-12-05 06:14:21.0497739 +0000 UTC m=+1280.329785122" Dec 05 06:14:21 crc kubenswrapper[4865]: I1205 06:14:21.084187 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.890273215 podStartE2EDuration="8.084141828s" podCreationTimestamp="2025-12-05 06:14:13 +0000 UTC" firstStartedPulling="2025-12-05 06:14:14.797439837 +0000 UTC m=+1274.077451049" lastFinishedPulling="2025-12-05 06:14:19.99130844 +0000 UTC m=+1279.271319662" observedRunningTime="2025-12-05 06:14:21.053282039 +0000 UTC m=+1280.333293281" watchObservedRunningTime="2025-12-05 06:14:21.084141828 +0000 UTC m=+1280.364153050" Dec 05 06:14:21 crc kubenswrapper[4865]: I1205 06:14:21.161382 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-bd68dd9b8-z62zt" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 05 06:14:21 crc kubenswrapper[4865]: I1205 06:14:21.896217 4865 generic.go:334] "Generic (PLEG): container finished" podID="0bbf49c8-107b-45e3-9d7f-a3203023f2bb" containerID="f122e3eb8ad7d86eccb265103f75054a4ebe9a344a2dd0b5d7b3f2a8e85b0083" exitCode=143 Dec 05 06:14:21 crc kubenswrapper[4865]: I1205 06:14:21.896304 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"0bbf49c8-107b-45e3-9d7f-a3203023f2bb","Type":"ContainerDied","Data":"f122e3eb8ad7d86eccb265103f75054a4ebe9a344a2dd0b5d7b3f2a8e85b0083"} Dec 05 06:14:22 crc kubenswrapper[4865]: E1205 06:14:22.672473 4865 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca38ca20_0d35_4058_b0f6_bbe4251c6aab.slice/crio-conmon-ec7463e2f2848e1408401056294c94afe477842efd894f3e32e53fc184fec217.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca38ca20_0d35_4058_b0f6_bbe4251c6aab.slice/crio-ec7463e2f2848e1408401056294c94afe477842efd894f3e32e53fc184fec217.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cdda782_13a7_4c36_a8f3_7b1d09fd2ca1.slice/crio-conmon-6870896ac2ef3a4699b002484f042cdd44757cdd6a9115f2c877d993d650e74e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cdda782_13a7_4c36_a8f3_7b1d09fd2ca1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cdda782_13a7_4c36_a8f3_7b1d09fd2ca1.slice/crio-6870896ac2ef3a4699b002484f042cdd44757cdd6a9115f2c877d993d650e74e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cdda782_13a7_4c36_a8f3_7b1d09fd2ca1.slice/crio-89ed7800bf7ae2c8e5678e239949816f78c03341fe289db77ff4d76b30f12fed\": RecentStats: unable to find data in memory cache]" Dec 05 06:14:22 crc kubenswrapper[4865]: I1205 06:14:22.913270 4865 generic.go:334] "Generic (PLEG): container finished" podID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerID="bf5eb3c540bf6f4d1c2cf3369e46010a3c5b8373292bec7ce126cccaca6796e4" exitCode=137 Dec 05 06:14:22 crc kubenswrapper[4865]: I1205 06:14:22.913378 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0","Type":"ContainerDied","Data":"bf5eb3c540bf6f4d1c2cf3369e46010a3c5b8373292bec7ce126cccaca6796e4"} Dec 05 06:14:22 crc kubenswrapper[4865]: I1205 06:14:22.926199 4865 generic.go:334] "Generic (PLEG): container finished" podID="0bbf49c8-107b-45e3-9d7f-a3203023f2bb" containerID="db97bf61fafbe7667ef16e258f977c1d01452157666988170700836e61160401" exitCode=0 Dec 05 06:14:22 crc kubenswrapper[4865]: I1205 06:14:22.926264 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bbf49c8-107b-45e3-9d7f-a3203023f2bb","Type":"ContainerDied","Data":"db97bf61fafbe7667ef16e258f977c1d01452157666988170700836e61160401"} Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.406904 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.412910 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.492955 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-logs\") pod \"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\" (UID: \"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\") " Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.493070 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-log-httpd\") pod \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.493122 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-sg-core-conf-yaml\") pod \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.493206 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-scripts\") pod \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.493253 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-combined-ca-bundle\") pod \"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\" (UID: \"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\") " Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.493340 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7lts\" (UniqueName: \"kubernetes.io/projected/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-kube-api-access-t7lts\") pod \"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\" (UID: \"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\") " Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.493385 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-combined-ca-bundle\") pod \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.493435 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-config-data\") pod \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.493453 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-run-httpd\") pod \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.493540 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-config-data\") pod \"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\" (UID: \"0bbf49c8-107b-45e3-9d7f-a3203023f2bb\") " Dec 05 
06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.493678 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-logs" (OuterVolumeSpecName: "logs") pod "0bbf49c8-107b-45e3-9d7f-a3203023f2bb" (UID: "0bbf49c8-107b-45e3-9d7f-a3203023f2bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.493714 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htm89\" (UniqueName: \"kubernetes.io/projected/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-kube-api-access-htm89\") pod \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\" (UID: \"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0\") " Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.494819 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" (UID: "d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.495207 4865 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-logs\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.495222 4865 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.504743 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" (UID: "d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.516669 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-scripts" (OuterVolumeSpecName: "scripts") pod "d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" (UID: "d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.537210 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-kube-api-access-t7lts" (OuterVolumeSpecName: "kube-api-access-t7lts") pod "0bbf49c8-107b-45e3-9d7f-a3203023f2bb" (UID: "0bbf49c8-107b-45e3-9d7f-a3203023f2bb"). InnerVolumeSpecName "kube-api-access-t7lts". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.537335 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-kube-api-access-htm89" (OuterVolumeSpecName: "kube-api-access-htm89") pod "d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" (UID: "d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0"). InnerVolumeSpecName "kube-api-access-htm89". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.562227 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0bbf49c8-107b-45e3-9d7f-a3203023f2bb" (UID: "0bbf49c8-107b-45e3-9d7f-a3203023f2bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.564222 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-config-data" (OuterVolumeSpecName: "config-data") pod "0bbf49c8-107b-45e3-9d7f-a3203023f2bb" (UID: "0bbf49c8-107b-45e3-9d7f-a3203023f2bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.572525 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" (UID: "d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.597258 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htm89\" (UniqueName: \"kubernetes.io/projected/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-kube-api-access-htm89\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.597302 4865 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.597315 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.597327 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.597340 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7lts\" (UniqueName: \"kubernetes.io/projected/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-kube-api-access-t7lts\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.597351 4865 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.597363 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bbf49c8-107b-45e3-9d7f-a3203023f2bb-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.660448 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" (UID: "d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.683685 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-config-data" (OuterVolumeSpecName: "config-data") pod "d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" (UID: "d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.698978 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.699020 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.935166 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bbf49c8-107b-45e3-9d7f-a3203023f2bb","Type":"ContainerDied","Data":"94f464fcd2a8f397406401ff7e3a87a5d29152a5a4db6f3b252ea7a95c1e0185"} Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.935648 4865 scope.go:117] "RemoveContainer" containerID="db97bf61fafbe7667ef16e258f977c1d01452157666988170700836e61160401" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.935972 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.938269 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0","Type":"ContainerDied","Data":"cf1359be45a9d6d7b062f5c4d3877be46e051cd13d5dc4ad86346d95102ad6c5"} Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.938360 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.972611 4865 scope.go:117] "RemoveContainer" containerID="f122e3eb8ad7d86eccb265103f75054a4ebe9a344a2dd0b5d7b3f2a8e85b0083" Dec 05 06:14:23 crc kubenswrapper[4865]: I1205 06:14:23.998368 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.030182 4865 scope.go:117] "RemoveContainer" containerID="bf5eb3c540bf6f4d1c2cf3369e46010a3c5b8373292bec7ce126cccaca6796e4" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.080876 4865 scope.go:117] "RemoveContainer" containerID="f6fe61521ad90a16156e49ed8286e05af7595598d5d2cc24e1be23f06621ac62" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.100443 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.127204 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.127242 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.148667 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:14:24 crc kubenswrapper[4865]: E1205 06:14:24.151110 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerName="ceilometer-notification-agent" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.151138 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerName="ceilometer-notification-agent" Dec 05 06:14:24 crc kubenswrapper[4865]: E1205 06:14:24.151157 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bbf49c8-107b-45e3-9d7f-a3203023f2bb" containerName="nova-metadata-metadata" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.151165 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bbf49c8-107b-45e3-9d7f-a3203023f2bb" containerName="nova-metadata-metadata" Dec 05 06:14:24 crc kubenswrapper[4865]: E1205 06:14:24.151201 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerName="sg-core" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.151207 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerName="sg-core" Dec 05 06:14:24 crc kubenswrapper[4865]: E1205 06:14:24.151224 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerName="proxy-httpd" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.151230 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerName="proxy-httpd" Dec 05 06:14:24 crc kubenswrapper[4865]: E1205 06:14:24.151254 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerName="ceilometer-central-agent" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.151262 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerName="ceilometer-central-agent" Dec 05 06:14:24 crc kubenswrapper[4865]: E1205 06:14:24.151286 4865 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0bbf49c8-107b-45e3-9d7f-a3203023f2bb" containerName="nova-metadata-log" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.151292 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bbf49c8-107b-45e3-9d7f-a3203023f2bb" containerName="nova-metadata-log" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.151854 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerName="ceilometer-central-agent" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.151891 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerName="ceilometer-notification-agent" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.151908 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bbf49c8-107b-45e3-9d7f-a3203023f2bb" containerName="nova-metadata-metadata" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.151925 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bbf49c8-107b-45e3-9d7f-a3203023f2bb" containerName="nova-metadata-log" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.151941 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerName="proxy-httpd" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.151955 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" containerName="sg-core" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.155197 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.160291 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.160642 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.196788 4865 scope.go:117] "RemoveContainer" containerID="4626a8d95506322cf22624e542215549495418a13ace2b6e864aa9dd319b2b1a" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.209737 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.211489 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.246571 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.256002 4865 scope.go:117] "RemoveContainer" containerID="a854569bfe5f64034de0c2f46c6f057011a89690fd27a4e079b96c9689098f19" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.267149 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.294130 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.298329 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.302955 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xkvp\" (UniqueName: \"kubernetes.io/projected/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-kube-api-access-5xkvp\") pod \"nova-metadata-0\" (UID: \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " pod="openstack/nova-metadata-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.303038 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drkbf\" (UniqueName: \"kubernetes.io/projected/f7834dad-27c7-4da7-9e87-1d5196269fe4-kube-api-access-drkbf\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.303077 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-config-data\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.303095 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.303284 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.303372 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.303417 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7834dad-27c7-4da7-9e87-1d5196269fe4-log-httpd\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.303482 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-logs\") pod \"nova-metadata-0\" (UID: \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " pod="openstack/nova-metadata-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.303510 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7834dad-27c7-4da7-9e87-1d5196269fe4-run-httpd\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.304756 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-config-data\") pod \"nova-metadata-0\" (UID: \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " pod="openstack/nova-metadata-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.304806 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.304883 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " pod="openstack/nova-metadata-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.305126 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-scripts\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.305187 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " pod="openstack/nova-metadata-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.333384 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.385045 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.385091 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.406982 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-logs\") pod \"nova-metadata-0\" (UID: \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " pod="openstack/nova-metadata-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.407041 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7834dad-27c7-4da7-9e87-1d5196269fe4-run-httpd\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.407108 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-config-data\") pod \"nova-metadata-0\" (UID: \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " pod="openstack/nova-metadata-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.407143 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.407188 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " pod="openstack/nova-metadata-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.407215 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-scripts\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.407261 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " pod="openstack/nova-metadata-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.407295 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xkvp\" (UniqueName: \"kubernetes.io/projected/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-kube-api-access-5xkvp\") pod \"nova-metadata-0\" (UID: \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " pod="openstack/nova-metadata-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.407327 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drkbf\" (UniqueName: \"kubernetes.io/projected/f7834dad-27c7-4da7-9e87-1d5196269fe4-kube-api-access-drkbf\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.407353 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-config-data\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.407373 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.407434 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7834dad-27c7-4da7-9e87-1d5196269fe4-log-httpd\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.408219 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-logs\") pod \"nova-metadata-0\" (UID: \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " pod="openstack/nova-metadata-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.408518 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7834dad-27c7-4da7-9e87-1d5196269fe4-run-httpd\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.413863 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-config-data\") pod \"nova-metadata-0\" (UID: 
\"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " pod="openstack/nova-metadata-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.414239 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7834dad-27c7-4da7-9e87-1d5196269fe4-log-httpd\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.417037 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " pod="openstack/nova-metadata-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.417461 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.417481 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.418288 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-scripts\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.419617 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-config-data\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.428482 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xkvp\" (UniqueName: \"kubernetes.io/projected/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-kube-api-access-5xkvp\") pod \"nova-metadata-0\" (UID: \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " pod="openstack/nova-metadata-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.428702 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " pod="openstack/nova-metadata-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.436633 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drkbf\" (UniqueName: \"kubernetes.io/projected/f7834dad-27c7-4da7-9e87-1d5196269fe4-kube-api-access-drkbf\") pod \"ceilometer-0\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.460359 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.498274 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.649227 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.687056 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.811535 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-cc4q4"] Dec 05 06:14:24 crc kubenswrapper[4865]: I1205 06:14:24.811859 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" podUID="b934cf19-1e79-4b97-bf09-8af1cb89d6d5" containerName="dnsmasq-dns" containerID="cri-o://733f94fcf8a137a64516bce1be36da4adea848cac537b1c468db642ae012d9f3" gracePeriod=10 Dec 05 06:14:25 crc kubenswrapper[4865]: I1205 06:14:25.048213 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bbf49c8-107b-45e3-9d7f-a3203023f2bb" path="/var/lib/kubelet/pods/0bbf49c8-107b-45e3-9d7f-a3203023f2bb/volumes" Dec 05 06:14:25 crc kubenswrapper[4865]: I1205 06:14:25.050600 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0" path="/var/lib/kubelet/pods/d8980941-4bc7-4ea0-a7cf-e6e3a03d02f0/volumes" Dec 05 06:14:25 crc kubenswrapper[4865]: I1205 06:14:25.078689 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 06:14:25 crc kubenswrapper[4865]: I1205 06:14:25.474121 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4c81d161-71fa-4b7d-b386-51c0eb914cb4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 06:14:25 crc kubenswrapper[4865]: I1205 06:14:25.474083 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4c81d161-71fa-4b7d-b386-51c0eb914cb4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 06:14:25 crc kubenswrapper[4865]: I1205 06:14:25.542449 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:14:25 crc kubenswrapper[4865]: I1205 06:14:25.613943 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" podUID="b934cf19-1e79-4b97-bf09-8af1cb89d6d5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.162:5353: connect: connection refused" Dec 05 06:14:25 crc kubenswrapper[4865]: I1205 06:14:25.628188 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.032633 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2","Type":"ContainerStarted","Data":"d0a8e32bda81c550896acc14b7b7068afb3bb46562badcc1a1b3dcfe60aeb4b0"} Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.081581 4865 generic.go:334] "Generic (PLEG): container finished" podID="b934cf19-1e79-4b97-bf09-8af1cb89d6d5" containerID="733f94fcf8a137a64516bce1be36da4adea848cac537b1c468db642ae012d9f3" exitCode=0 Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 
06:14:26.082557 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" event={"ID":"b934cf19-1e79-4b97-bf09-8af1cb89d6d5","Type":"ContainerDied","Data":"733f94fcf8a137a64516bce1be36da4adea848cac537b1c468db642ae012d9f3"} Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.098533 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7834dad-27c7-4da7-9e87-1d5196269fe4","Type":"ContainerStarted","Data":"90e38e3e66851ebf348411d45a965412324644051dc86dc780020ffd27b3860a"} Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.188228 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.261524 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hftz2\" (UniqueName: \"kubernetes.io/projected/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-kube-api-access-hftz2\") pod \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.261709 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-dns-svc\") pod \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.261765 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-config\") pod \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.261779 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-dns-swift-storage-0\") pod \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.261812 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-ovsdbserver-sb\") pod \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.261893 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-ovsdbserver-nb\") pod \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.292079 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-kube-api-access-hftz2" (OuterVolumeSpecName: "kube-api-access-hftz2") pod "b934cf19-1e79-4b97-bf09-8af1cb89d6d5" (UID: "b934cf19-1e79-4b97-bf09-8af1cb89d6d5"). InnerVolumeSpecName "kube-api-access-hftz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.351752 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b934cf19-1e79-4b97-bf09-8af1cb89d6d5" (UID: "b934cf19-1e79-4b97-bf09-8af1cb89d6d5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.363751 4865 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.364048 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hftz2\" (UniqueName: \"kubernetes.io/projected/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-kube-api-access-hftz2\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.369216 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-config" (OuterVolumeSpecName: "config") pod "b934cf19-1e79-4b97-bf09-8af1cb89d6d5" (UID: "b934cf19-1e79-4b97-bf09-8af1cb89d6d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.430848 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b934cf19-1e79-4b97-bf09-8af1cb89d6d5" (UID: "b934cf19-1e79-4b97-bf09-8af1cb89d6d5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.466525 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b934cf19-1e79-4b97-bf09-8af1cb89d6d5" (UID: "b934cf19-1e79-4b97-bf09-8af1cb89d6d5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.466672 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-dns-svc\") pod \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\" (UID: \"b934cf19-1e79-4b97-bf09-8af1cb89d6d5\") " Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.467471 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.467492 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:26 crc kubenswrapper[4865]: W1205 06:14:26.467600 4865 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/b934cf19-1e79-4b97-bf09-8af1cb89d6d5/volumes/kubernetes.io~configmap/dns-svc Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.467614 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b934cf19-1e79-4b97-bf09-8af1cb89d6d5" (UID: "b934cf19-1e79-4b97-bf09-8af1cb89d6d5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.550276 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b934cf19-1e79-4b97-bf09-8af1cb89d6d5" (UID: "b934cf19-1e79-4b97-bf09-8af1cb89d6d5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.568832 4865 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:26 crc kubenswrapper[4865]: I1205 06:14:26.568866 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b934cf19-1e79-4b97-bf09-8af1cb89d6d5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:27 crc kubenswrapper[4865]: I1205 06:14:27.107430 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7834dad-27c7-4da7-9e87-1d5196269fe4","Type":"ContainerStarted","Data":"26dcf7afe8f1130270983c82cec445391c3d56b2814ce108531cf02e93897d0a"} Dec 05 06:14:27 crc kubenswrapper[4865]: I1205 06:14:27.108570 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2","Type":"ContainerStarted","Data":"d31955fb8cddbf74252ee7082a0006522693c5add1b465a7006c1eb052e70c23"} Dec 05 06:14:27 crc kubenswrapper[4865]: I1205 06:14:27.109757 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" event={"ID":"b934cf19-1e79-4b97-bf09-8af1cb89d6d5","Type":"ContainerDied","Data":"12b553a7dfb4ab11be5b3ebc373fa25160aec99f93ea307901255ce58c9382fb"} Dec 05 06:14:27 crc kubenswrapper[4865]: I1205 06:14:27.109792 4865 scope.go:117] "RemoveContainer" containerID="733f94fcf8a137a64516bce1be36da4adea848cac537b1c468db642ae012d9f3" Dec 05 06:14:27 crc kubenswrapper[4865]: I1205 06:14:27.110912 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-cc4q4" Dec 05 06:14:27 crc kubenswrapper[4865]: I1205 06:14:27.146565 4865 scope.go:117] "RemoveContainer" containerID="652c47eb401454a63b9cd27031a6b25376cf01e022683cb7909e7aaeab604c2f" Dec 05 06:14:27 crc kubenswrapper[4865]: I1205 06:14:27.168441 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-cc4q4"] Dec 05 06:14:27 crc kubenswrapper[4865]: I1205 06:14:27.180912 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-cc4q4"] Dec 05 06:14:28 crc kubenswrapper[4865]: I1205 06:14:28.123652 4865 generic.go:334] "Generic (PLEG): container finished" podID="67260032-55e2-4709-84b1-577259ffa891" containerID="ba7bfaeeb4e0660eb347f24521eb2c636d6843df8db40fff9d365c3fe42dbabb" exitCode=0 Dec 05 06:14:28 crc kubenswrapper[4865]: I1205 06:14:28.123747 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lt8qn" event={"ID":"67260032-55e2-4709-84b1-577259ffa891","Type":"ContainerDied","Data":"ba7bfaeeb4e0660eb347f24521eb2c636d6843df8db40fff9d365c3fe42dbabb"} Dec 05 06:14:28 crc kubenswrapper[4865]: I1205 06:14:28.128650 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7834dad-27c7-4da7-9e87-1d5196269fe4","Type":"ContainerStarted","Data":"2b6f757c83e0c97cad4df3d2c88efced23e4f88e29586484c9852217cf4cd6e5"} Dec 05 06:14:28 crc kubenswrapper[4865]: I1205 06:14:28.131841 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2","Type":"ContainerStarted","Data":"9b7741d142fdf055c60f405cfeb253a5d6b508499e5299a02d218346309c287b"} Dec 05 06:14:28 crc kubenswrapper[4865]: I1205 06:14:28.173497 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=5.173475915 podStartE2EDuration="5.173475915s" podCreationTimestamp="2025-12-05 06:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:14:28.169362228 +0000 UTC m=+1287.449373450" watchObservedRunningTime="2025-12-05 06:14:28.173475915 +0000 UTC m=+1287.453487137" Dec 05 06:14:29 crc kubenswrapper[4865]: I1205 06:14:29.020203 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b934cf19-1e79-4b97-bf09-8af1cb89d6d5" path="/var/lib/kubelet/pods/b934cf19-1e79-4b97-bf09-8af1cb89d6d5/volumes" Dec 05 06:14:29 crc kubenswrapper[4865]: I1205 06:14:29.152486 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7834dad-27c7-4da7-9e87-1d5196269fe4","Type":"ContainerStarted","Data":"c65419b3db36eed33b8240e2d190a8e80c2e824b7738cb8308d6481e027e6c87"} Dec 05 06:14:29 crc kubenswrapper[4865]: I1205 06:14:29.498926 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 06:14:29 crc kubenswrapper[4865]: I1205 06:14:29.499914 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 06:14:29 crc kubenswrapper[4865]: I1205 06:14:29.773141 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lt8qn" Dec 05 06:14:29 crc kubenswrapper[4865]: I1205 06:14:29.836991 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-scripts\") pod \"67260032-55e2-4709-84b1-577259ffa891\" (UID: \"67260032-55e2-4709-84b1-577259ffa891\") " Dec 05 06:14:29 crc kubenswrapper[4865]: I1205 06:14:29.837091 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-combined-ca-bundle\") pod \"67260032-55e2-4709-84b1-577259ffa891\" (UID: \"67260032-55e2-4709-84b1-577259ffa891\") " Dec 05 06:14:29 crc kubenswrapper[4865]: I1205 06:14:29.837532 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-config-data\") pod \"67260032-55e2-4709-84b1-577259ffa891\" (UID: \"67260032-55e2-4709-84b1-577259ffa891\") " Dec 05 06:14:29 crc kubenswrapper[4865]: I1205 06:14:29.837611 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f7bz\" (UniqueName: \"kubernetes.io/projected/67260032-55e2-4709-84b1-577259ffa891-kube-api-access-6f7bz\") pod \"67260032-55e2-4709-84b1-577259ffa891\" (UID: \"67260032-55e2-4709-84b1-577259ffa891\") " Dec 05 06:14:29 crc kubenswrapper[4865]: I1205 06:14:29.843428 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67260032-55e2-4709-84b1-577259ffa891-kube-api-access-6f7bz" (OuterVolumeSpecName: "kube-api-access-6f7bz") pod "67260032-55e2-4709-84b1-577259ffa891" (UID: "67260032-55e2-4709-84b1-577259ffa891"). InnerVolumeSpecName "kube-api-access-6f7bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:14:29 crc kubenswrapper[4865]: I1205 06:14:29.870338 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-scripts" (OuterVolumeSpecName: "scripts") pod "67260032-55e2-4709-84b1-577259ffa891" (UID: "67260032-55e2-4709-84b1-577259ffa891"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:29 crc kubenswrapper[4865]: E1205 06:14:29.896567 4865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-config-data podName:67260032-55e2-4709-84b1-577259ffa891 nodeName:}" failed. No retries permitted until 2025-12-05 06:14:30.396545721 +0000 UTC m=+1289.676556953 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-config-data") pod "67260032-55e2-4709-84b1-577259ffa891" (UID: "67260032-55e2-4709-84b1-577259ffa891") : error deleting /var/lib/kubelet/pods/67260032-55e2-4709-84b1-577259ffa891/volume-subpaths: remove /var/lib/kubelet/pods/67260032-55e2-4709-84b1-577259ffa891/volume-subpaths: no such file or directory Dec 05 06:14:29 crc kubenswrapper[4865]: I1205 06:14:29.906566 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67260032-55e2-4709-84b1-577259ffa891" (UID: "67260032-55e2-4709-84b1-577259ffa891"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:29 crc kubenswrapper[4865]: I1205 06:14:29.940535 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f7bz\" (UniqueName: \"kubernetes.io/projected/67260032-55e2-4709-84b1-577259ffa891-kube-api-access-6f7bz\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:29 crc kubenswrapper[4865]: I1205 06:14:29.940614 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:29 crc kubenswrapper[4865]: I1205 06:14:29.940642 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:30 crc kubenswrapper[4865]: I1205 06:14:30.206360 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lt8qn" event={"ID":"67260032-55e2-4709-84b1-577259ffa891","Type":"ContainerDied","Data":"5f6d0f71400b9250318e9280a42d64dc439f250327abda57f8d4ef066bee509e"} Dec 05 06:14:30 crc kubenswrapper[4865]: I1205 06:14:30.206683 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f6d0f71400b9250318e9280a42d64dc439f250327abda57f8d4ef066bee509e" Dec 05 06:14:30 crc kubenswrapper[4865]: I1205 06:14:30.206626 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lt8qn" Dec 05 06:14:30 crc kubenswrapper[4865]: I1205 06:14:30.210242 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7834dad-27c7-4da7-9e87-1d5196269fe4","Type":"ContainerStarted","Data":"d1790cc075eaf351478d388dbb84bec06d1ed3ab500165ba61b4978fc6ebaf9f"} Dec 05 06:14:30 crc kubenswrapper[4865]: I1205 06:14:30.210348 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 06:14:30 crc kubenswrapper[4865]: I1205 06:14:30.213171 4865 generic.go:334] "Generic (PLEG): container finished" podID="479f47b5-b756-41b2-af12-ca6fcbe867a3" containerID="a8ec670a519c895770256b56e62232ad7cd36b7e39e166cdefeec80fa4470e4d" exitCode=0 Dec 05 06:14:30 crc kubenswrapper[4865]: I1205 06:14:30.213757 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-skfnv" event={"ID":"479f47b5-b756-41b2-af12-ca6fcbe867a3","Type":"ContainerDied","Data":"a8ec670a519c895770256b56e62232ad7cd36b7e39e166cdefeec80fa4470e4d"} Dec 05 06:14:30 crc kubenswrapper[4865]: I1205 06:14:30.236494 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.251491665 podStartE2EDuration="6.236467727s" podCreationTimestamp="2025-12-05 06:14:24 +0000 UTC" firstStartedPulling="2025-12-05 06:14:25.614676658 +0000 UTC m=+1284.894687880" lastFinishedPulling="2025-12-05 06:14:29.59965272 +0000 UTC m=+1288.879663942" observedRunningTime="2025-12-05 06:14:30.231422723 +0000 UTC m=+1289.511433945" watchObservedRunningTime="2025-12-05 06:14:30.236467727 +0000 UTC m=+1289.516478949" Dec 05 06:14:30 crc kubenswrapper[4865]: I1205 06:14:30.329805 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 06:14:30 crc kubenswrapper[4865]: I1205 06:14:30.330088 4865 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-api-0" podUID="4c81d161-71fa-4b7d-b386-51c0eb914cb4" containerName="nova-api-log" containerID="cri-o://ed1fd4bf6a3c5ae267f278e43bc60e370c2d4bd2d5acaba3efaf95ace1f7c7dc" gracePeriod=30 Dec 05 06:14:30 crc kubenswrapper[4865]: I1205 06:14:30.330157 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4c81d161-71fa-4b7d-b386-51c0eb914cb4" containerName="nova-api-api" containerID="cri-o://63d7de5b1932f0adce37697171115e7b464af3a2630458868d9f71ebb24261de" gracePeriod=30 Dec 05 06:14:30 crc kubenswrapper[4865]: I1205 06:14:30.344859 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 06:14:30 crc kubenswrapper[4865]: I1205 06:14:30.345343 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ce9f6f7e-e815-4cad-a620-08d913f360ec" containerName="nova-scheduler-scheduler" containerID="cri-o://3ec1846e07c1ddff1db5e75802c79654fc6b3f26b92251e6679a56f90186933e" gracePeriod=30 Dec 05 06:14:30 crc kubenswrapper[4865]: I1205 06:14:30.395320 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:14:30 crc kubenswrapper[4865]: I1205 06:14:30.448634 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-config-data\") pod \"67260032-55e2-4709-84b1-577259ffa891\" (UID: \"67260032-55e2-4709-84b1-577259ffa891\") " Dec 05 06:14:30 crc kubenswrapper[4865]: I1205 06:14:30.456004 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-config-data" (OuterVolumeSpecName: "config-data") pod "67260032-55e2-4709-84b1-577259ffa891" (UID: "67260032-55e2-4709-84b1-577259ffa891"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:30 crc kubenswrapper[4865]: I1205 06:14:30.550897 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67260032-55e2-4709-84b1-577259ffa891-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:31 crc kubenswrapper[4865]: I1205 06:14:31.156804 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-bd68dd9b8-z62zt" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 05 06:14:31 crc kubenswrapper[4865]: I1205 06:14:31.157221 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:14:31 crc kubenswrapper[4865]: I1205 06:14:31.225902 4865 generic.go:334] "Generic (PLEG): container finished" podID="4c81d161-71fa-4b7d-b386-51c0eb914cb4" containerID="ed1fd4bf6a3c5ae267f278e43bc60e370c2d4bd2d5acaba3efaf95ace1f7c7dc" exitCode=143 Dec 05 06:14:31 crc kubenswrapper[4865]: I1205 06:14:31.225879 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c81d161-71fa-4b7d-b386-51c0eb914cb4","Type":"ContainerDied","Data":"ed1fd4bf6a3c5ae267f278e43bc60e370c2d4bd2d5acaba3efaf95ace1f7c7dc"} Dec 05 06:14:31 crc kubenswrapper[4865]: I1205 06:14:31.672161 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-skfnv" Dec 05 06:14:31 crc kubenswrapper[4865]: I1205 06:14:31.773766 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctfck\" (UniqueName: \"kubernetes.io/projected/479f47b5-b756-41b2-af12-ca6fcbe867a3-kube-api-access-ctfck\") pod \"479f47b5-b756-41b2-af12-ca6fcbe867a3\" (UID: \"479f47b5-b756-41b2-af12-ca6fcbe867a3\") " Dec 05 06:14:31 crc kubenswrapper[4865]: I1205 06:14:31.773854 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479f47b5-b756-41b2-af12-ca6fcbe867a3-combined-ca-bundle\") pod \"479f47b5-b756-41b2-af12-ca6fcbe867a3\" (UID: \"479f47b5-b756-41b2-af12-ca6fcbe867a3\") " Dec 05 06:14:31 crc kubenswrapper[4865]: I1205 06:14:31.773935 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/479f47b5-b756-41b2-af12-ca6fcbe867a3-scripts\") pod \"479f47b5-b756-41b2-af12-ca6fcbe867a3\" (UID: \"479f47b5-b756-41b2-af12-ca6fcbe867a3\") " Dec 05 06:14:31 crc kubenswrapper[4865]: I1205 06:14:31.774126 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/479f47b5-b756-41b2-af12-ca6fcbe867a3-config-data\") pod \"479f47b5-b756-41b2-af12-ca6fcbe867a3\" (UID: \"479f47b5-b756-41b2-af12-ca6fcbe867a3\") " Dec 05 06:14:31 crc kubenswrapper[4865]: I1205 06:14:31.787564 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/479f47b5-b756-41b2-af12-ca6fcbe867a3-kube-api-access-ctfck" (OuterVolumeSpecName: "kube-api-access-ctfck") pod "479f47b5-b756-41b2-af12-ca6fcbe867a3" (UID: "479f47b5-b756-41b2-af12-ca6fcbe867a3"). InnerVolumeSpecName "kube-api-access-ctfck". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:14:31 crc kubenswrapper[4865]: I1205 06:14:31.794009 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479f47b5-b756-41b2-af12-ca6fcbe867a3-scripts" (OuterVolumeSpecName: "scripts") pod "479f47b5-b756-41b2-af12-ca6fcbe867a3" (UID: "479f47b5-b756-41b2-af12-ca6fcbe867a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:31 crc kubenswrapper[4865]: I1205 06:14:31.803438 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479f47b5-b756-41b2-af12-ca6fcbe867a3-config-data" (OuterVolumeSpecName: "config-data") pod "479f47b5-b756-41b2-af12-ca6fcbe867a3" (UID: "479f47b5-b756-41b2-af12-ca6fcbe867a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:31 crc kubenswrapper[4865]: I1205 06:14:31.815530 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479f47b5-b756-41b2-af12-ca6fcbe867a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "479f47b5-b756-41b2-af12-ca6fcbe867a3" (UID: "479f47b5-b756-41b2-af12-ca6fcbe867a3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:31 crc kubenswrapper[4865]: I1205 06:14:31.876631 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/479f47b5-b756-41b2-af12-ca6fcbe867a3-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:31 crc kubenswrapper[4865]: I1205 06:14:31.876666 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctfck\" (UniqueName: \"kubernetes.io/projected/479f47b5-b756-41b2-af12-ca6fcbe867a3-kube-api-access-ctfck\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:31 crc kubenswrapper[4865]: I1205 06:14:31.876677 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479f47b5-b756-41b2-af12-ca6fcbe867a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:31 crc kubenswrapper[4865]: I1205 06:14:31.876686 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/479f47b5-b756-41b2-af12-ca6fcbe867a3-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.273348 4865 generic.go:334] "Generic (PLEG): container finished" podID="ce9f6f7e-e815-4cad-a620-08d913f360ec" containerID="3ec1846e07c1ddff1db5e75802c79654fc6b3f26b92251e6679a56f90186933e" exitCode=0 Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.273432 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ce9f6f7e-e815-4cad-a620-08d913f360ec","Type":"ContainerDied","Data":"3ec1846e07c1ddff1db5e75802c79654fc6b3f26b92251e6679a56f90186933e"} Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.300754 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2" containerName="nova-metadata-log" containerID="cri-o://d31955fb8cddbf74252ee7082a0006522693c5add1b465a7006c1eb052e70c23" gracePeriod=30 Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.301210 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-skfnv" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.312994 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-skfnv" event={"ID":"479f47b5-b756-41b2-af12-ca6fcbe867a3","Type":"ContainerDied","Data":"4ab84d8da7d0f1b77d44a3b985b34509875777cc3cc96706ee63e7b168dbf2be"} Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.313054 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ab84d8da7d0f1b77d44a3b985b34509875777cc3cc96706ee63e7b168dbf2be" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.313123 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2" containerName="nova-metadata-metadata" containerID="cri-o://9b7741d142fdf055c60f405cfeb253a5d6b508499e5299a02d218346309c287b" gracePeriod=30 Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.446387 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 06:14:32 crc kubenswrapper[4865]: E1205 06:14:32.453421 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67260032-55e2-4709-84b1-577259ffa891" containerName="nova-manage" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.453462 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="67260032-55e2-4709-84b1-577259ffa891" containerName="nova-manage" Dec 05 06:14:32 crc kubenswrapper[4865]: E1205 06:14:32.453479 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b934cf19-1e79-4b97-bf09-8af1cb89d6d5" containerName="dnsmasq-dns" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.453485 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="b934cf19-1e79-4b97-bf09-8af1cb89d6d5" containerName="dnsmasq-dns" Dec 05 06:14:32 crc kubenswrapper[4865]: E1205 06:14:32.453510 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b934cf19-1e79-4b97-bf09-8af1cb89d6d5" containerName="init" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.453542 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="b934cf19-1e79-4b97-bf09-8af1cb89d6d5" containerName="init" Dec 05 06:14:32 crc kubenswrapper[4865]: E1205 06:14:32.453555 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479f47b5-b756-41b2-af12-ca6fcbe867a3" containerName="nova-cell1-conductor-db-sync" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.453561 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="479f47b5-b756-41b2-af12-ca6fcbe867a3" containerName="nova-cell1-conductor-db-sync" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.453858 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="b934cf19-1e79-4b97-bf09-8af1cb89d6d5" containerName="dnsmasq-dns" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.453892 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="479f47b5-b756-41b2-af12-ca6fcbe867a3" containerName="nova-cell1-conductor-db-sync" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.453904 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="67260032-55e2-4709-84b1-577259ffa891" containerName="nova-manage" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.454603 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.468286 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.476029 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.539834 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2587c341-67da-4cfc-a5fc-44d3eeefa9a4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2587c341-67da-4cfc-a5fc-44d3eeefa9a4\") " pod="openstack/nova-cell1-conductor-0" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.539945 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2587c341-67da-4cfc-a5fc-44d3eeefa9a4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2587c341-67da-4cfc-a5fc-44d3eeefa9a4\") " pod="openstack/nova-cell1-conductor-0" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.539989 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4mm2\" (UniqueName: \"kubernetes.io/projected/2587c341-67da-4cfc-a5fc-44d3eeefa9a4-kube-api-access-b4mm2\") pod \"nova-cell1-conductor-0\" (UID: \"2587c341-67da-4cfc-a5fc-44d3eeefa9a4\") " pod="openstack/nova-cell1-conductor-0" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.646969 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2587c341-67da-4cfc-a5fc-44d3eeefa9a4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2587c341-67da-4cfc-a5fc-44d3eeefa9a4\") " pod="openstack/nova-cell1-conductor-0" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.647053 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2587c341-67da-4cfc-a5fc-44d3eeefa9a4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2587c341-67da-4cfc-a5fc-44d3eeefa9a4\") " pod="openstack/nova-cell1-conductor-0" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.647075 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4mm2\" (UniqueName: \"kubernetes.io/projected/2587c341-67da-4cfc-a5fc-44d3eeefa9a4-kube-api-access-b4mm2\") pod \"nova-cell1-conductor-0\" (UID: \"2587c341-67da-4cfc-a5fc-44d3eeefa9a4\") " pod="openstack/nova-cell1-conductor-0" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.662150 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2587c341-67da-4cfc-a5fc-44d3eeefa9a4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2587c341-67da-4cfc-a5fc-44d3eeefa9a4\") " pod="openstack/nova-cell1-conductor-0" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.667944 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2587c341-67da-4cfc-a5fc-44d3eeefa9a4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2587c341-67da-4cfc-a5fc-44d3eeefa9a4\") " pod="openstack/nova-cell1-conductor-0" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.686761 4865 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4mm2\" (UniqueName: \"kubernetes.io/projected/2587c341-67da-4cfc-a5fc-44d3eeefa9a4-kube-api-access-b4mm2\") pod \"nova-cell1-conductor-0\" (UID: \"2587c341-67da-4cfc-a5fc-44d3eeefa9a4\") " pod="openstack/nova-cell1-conductor-0" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.809953 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.816158 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.952524 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls6bv\" (UniqueName: \"kubernetes.io/projected/ce9f6f7e-e815-4cad-a620-08d913f360ec-kube-api-access-ls6bv\") pod \"ce9f6f7e-e815-4cad-a620-08d913f360ec\" (UID: \"ce9f6f7e-e815-4cad-a620-08d913f360ec\") " Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.952584 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9f6f7e-e815-4cad-a620-08d913f360ec-combined-ca-bundle\") pod \"ce9f6f7e-e815-4cad-a620-08d913f360ec\" (UID: \"ce9f6f7e-e815-4cad-a620-08d913f360ec\") " Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.952732 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce9f6f7e-e815-4cad-a620-08d913f360ec-config-data\") pod \"ce9f6f7e-e815-4cad-a620-08d913f360ec\" (UID: \"ce9f6f7e-e815-4cad-a620-08d913f360ec\") " Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.959091 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce9f6f7e-e815-4cad-a620-08d913f360ec-kube-api-access-ls6bv" (OuterVolumeSpecName: "kube-api-access-ls6bv") pod "ce9f6f7e-e815-4cad-a620-08d913f360ec" (UID: "ce9f6f7e-e815-4cad-a620-08d913f360ec"). InnerVolumeSpecName "kube-api-access-ls6bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:14:32 crc kubenswrapper[4865]: I1205 06:14:32.987876 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce9f6f7e-e815-4cad-a620-08d913f360ec-config-data" (OuterVolumeSpecName: "config-data") pod "ce9f6f7e-e815-4cad-a620-08d913f360ec" (UID: "ce9f6f7e-e815-4cad-a620-08d913f360ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.025265 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce9f6f7e-e815-4cad-a620-08d913f360ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce9f6f7e-e815-4cad-a620-08d913f360ec" (UID: "ce9f6f7e-e815-4cad-a620-08d913f360ec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.054895 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce9f6f7e-e815-4cad-a620-08d913f360ec-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.054924 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls6bv\" (UniqueName: \"kubernetes.io/projected/ce9f6f7e-e815-4cad-a620-08d913f360ec-kube-api-access-ls6bv\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.054936 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9f6f7e-e815-4cad-a620-08d913f360ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.317983 4865 generic.go:334] "Generic (PLEG): container finished" podID="ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2" containerID="9b7741d142fdf055c60f405cfeb253a5d6b508499e5299a02d218346309c287b" exitCode=0 Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.318016 4865 generic.go:334] "Generic (PLEG): container finished" podID="ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2" containerID="d31955fb8cddbf74252ee7082a0006522693c5add1b465a7006c1eb052e70c23" exitCode=143 Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.318056 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2","Type":"ContainerDied","Data":"9b7741d142fdf055c60f405cfeb253a5d6b508499e5299a02d218346309c287b"} Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.318083 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2","Type":"ContainerDied","Data":"d31955fb8cddbf74252ee7082a0006522693c5add1b465a7006c1eb052e70c23"} Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.319304 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ce9f6f7e-e815-4cad-a620-08d913f360ec","Type":"ContainerDied","Data":"7010e373d1bbbba3bc84882d23aff7df2bb5e383930ef4f310ae6a9a58b2dfe3"} Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.319338 4865 scope.go:117] "RemoveContainer" containerID="3ec1846e07c1ddff1db5e75802c79654fc6b3f26b92251e6679a56f90186933e" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.319533 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.356725 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.396955 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.429054 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 06:14:33 crc kubenswrapper[4865]: E1205 06:14:33.429640 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce9f6f7e-e815-4cad-a620-08d913f360ec" containerName="nova-scheduler-scheduler" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.429659 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce9f6f7e-e815-4cad-a620-08d913f360ec" containerName="nova-scheduler-scheduler" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.429971 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce9f6f7e-e815-4cad-a620-08d913f360ec" containerName="nova-scheduler-scheduler" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.430841 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.443384 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.457205 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.502433 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.565512 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87766c6c-42b5-4851-9729-29f38ed36ae5-config-data\") pod \"nova-scheduler-0\" (UID: \"87766c6c-42b5-4851-9729-29f38ed36ae5\") " pod="openstack/nova-scheduler-0" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.565886 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87766c6c-42b5-4851-9729-29f38ed36ae5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"87766c6c-42b5-4851-9729-29f38ed36ae5\") " pod="openstack/nova-scheduler-0" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.566066 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chhxd\" (UniqueName: \"kubernetes.io/projected/87766c6c-42b5-4851-9729-29f38ed36ae5-kube-api-access-chhxd\") pod \"nova-scheduler-0\" (UID: \"87766c6c-42b5-4851-9729-29f38ed36ae5\") " pod="openstack/nova-scheduler-0" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.590326 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.668030 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xkvp\" (UniqueName: \"kubernetes.io/projected/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-kube-api-access-5xkvp\") pod \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\" (UID: \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.668156 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-nova-metadata-tls-certs\") pod \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\" (UID: \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.668195 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-combined-ca-bundle\") pod \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\" (UID: \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.668373 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-logs\") pod \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\" (UID: \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.668429 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-config-data\") pod \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\" (UID: \"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2\") " Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.668812 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87766c6c-42b5-4851-9729-29f38ed36ae5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"87766c6c-42b5-4851-9729-29f38ed36ae5\") " pod="openstack/nova-scheduler-0" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.668894 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chhxd\" (UniqueName: \"kubernetes.io/projected/87766c6c-42b5-4851-9729-29f38ed36ae5-kube-api-access-chhxd\") pod \"nova-scheduler-0\" (UID: \"87766c6c-42b5-4851-9729-29f38ed36ae5\") " pod="openstack/nova-scheduler-0" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.668934 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87766c6c-42b5-4851-9729-29f38ed36ae5-config-data\") pod \"nova-scheduler-0\" (UID: \"87766c6c-42b5-4851-9729-29f38ed36ae5\") " pod="openstack/nova-scheduler-0" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.670030 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-logs" (OuterVolumeSpecName: "logs") pod "ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2" (UID: "ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.675752 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87766c6c-42b5-4851-9729-29f38ed36ae5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"87766c6c-42b5-4851-9729-29f38ed36ae5\") " pod="openstack/nova-scheduler-0" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.683285 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-kube-api-access-5xkvp" (OuterVolumeSpecName: "kube-api-access-5xkvp") pod "ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2" (UID: "ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2"). InnerVolumeSpecName "kube-api-access-5xkvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.684753 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87766c6c-42b5-4851-9729-29f38ed36ae5-config-data\") pod \"nova-scheduler-0\" (UID: \"87766c6c-42b5-4851-9729-29f38ed36ae5\") " pod="openstack/nova-scheduler-0" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.687414 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chhxd\" (UniqueName: \"kubernetes.io/projected/87766c6c-42b5-4851-9729-29f38ed36ae5-kube-api-access-chhxd\") pod \"nova-scheduler-0\" (UID: \"87766c6c-42b5-4851-9729-29f38ed36ae5\") " pod="openstack/nova-scheduler-0" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.734281 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-config-data" (OuterVolumeSpecName: "config-data") pod "ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2" (UID: "ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.741646 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2" (UID: "ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.772059 4865 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-logs\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.772116 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.772127 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xkvp\" (UniqueName: \"kubernetes.io/projected/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-kube-api-access-5xkvp\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.772138 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.788481 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.791984 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2" (UID: "ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:33 crc kubenswrapper[4865]: I1205 06:14:33.874657 4865 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.112759 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.191532 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t5vr\" (UniqueName: \"kubernetes.io/projected/4c81d161-71fa-4b7d-b386-51c0eb914cb4-kube-api-access-9t5vr\") pod \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\" (UID: \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\") " Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.191593 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c81d161-71fa-4b7d-b386-51c0eb914cb4-config-data\") pod \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\" (UID: \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\") " Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.191640 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c81d161-71fa-4b7d-b386-51c0eb914cb4-logs\") pod \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\" (UID: \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\") " Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.191702 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c81d161-71fa-4b7d-b386-51c0eb914cb4-combined-ca-bundle\") pod \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\" (UID: \"4c81d161-71fa-4b7d-b386-51c0eb914cb4\") " Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.193268 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c81d161-71fa-4b7d-b386-51c0eb914cb4-logs" (OuterVolumeSpecName: "logs") pod "4c81d161-71fa-4b7d-b386-51c0eb914cb4" (UID: "4c81d161-71fa-4b7d-b386-51c0eb914cb4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.239872 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c81d161-71fa-4b7d-b386-51c0eb914cb4-kube-api-access-9t5vr" (OuterVolumeSpecName: "kube-api-access-9t5vr") pod "4c81d161-71fa-4b7d-b386-51c0eb914cb4" (UID: "4c81d161-71fa-4b7d-b386-51c0eb914cb4"). InnerVolumeSpecName "kube-api-access-9t5vr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.274916 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c81d161-71fa-4b7d-b386-51c0eb914cb4-config-data" (OuterVolumeSpecName: "config-data") pod "4c81d161-71fa-4b7d-b386-51c0eb914cb4" (UID: "4c81d161-71fa-4b7d-b386-51c0eb914cb4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.276886 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c81d161-71fa-4b7d-b386-51c0eb914cb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c81d161-71fa-4b7d-b386-51c0eb914cb4" (UID: "4c81d161-71fa-4b7d-b386-51c0eb914cb4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.294796 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c81d161-71fa-4b7d-b386-51c0eb914cb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.294843 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t5vr\" (UniqueName: \"kubernetes.io/projected/4c81d161-71fa-4b7d-b386-51c0eb914cb4-kube-api-access-9t5vr\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.294854 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c81d161-71fa-4b7d-b386-51c0eb914cb4-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.294864 4865 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c81d161-71fa-4b7d-b386-51c0eb914cb4-logs\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.358010 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2587c341-67da-4cfc-a5fc-44d3eeefa9a4","Type":"ContainerStarted","Data":"5be20e851a921af07a1d019d4492e5d54d9f212e22b3adea1bd11d871c4b77d2"} Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.358368 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2587c341-67da-4cfc-a5fc-44d3eeefa9a4","Type":"ContainerStarted","Data":"4871d3a936efcf93bfdd1e076fdc0ade644c987cd785cc65a555cdb7ce7c6abe"} Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.358404 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.376548 4865 generic.go:334] "Generic (PLEG): container finished" podID="4c81d161-71fa-4b7d-b386-51c0eb914cb4" containerID="63d7de5b1932f0adce37697171115e7b464af3a2630458868d9f71ebb24261de" exitCode=0 Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.376629 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c81d161-71fa-4b7d-b386-51c0eb914cb4","Type":"ContainerDied","Data":"63d7de5b1932f0adce37697171115e7b464af3a2630458868d9f71ebb24261de"} Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.376666 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c81d161-71fa-4b7d-b386-51c0eb914cb4","Type":"ContainerDied","Data":"7f1cc30fc1b698b1a21040825db41cf693150268e36da5eb1139e664d1b285b4"} Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.376687 4865 scope.go:117] "RemoveContainer" containerID="63d7de5b1932f0adce37697171115e7b464af3a2630458868d9f71ebb24261de" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.376886 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.380557 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.380541347 podStartE2EDuration="2.380541347s" podCreationTimestamp="2025-12-05 06:14:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:14:34.379842158 +0000 UTC m=+1293.659853380" watchObservedRunningTime="2025-12-05 06:14:34.380541347 +0000 UTC m=+1293.660552559" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.398011 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2","Type":"ContainerDied","Data":"d0a8e32bda81c550896acc14b7b7068afb3bb46562badcc1a1b3dcfe60aeb4b0"} Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.398097 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.427044 4865 scope.go:117] "RemoveContainer" containerID="ed1fd4bf6a3c5ae267f278e43bc60e370c2d4bd2d5acaba3efaf95ace1f7c7dc" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.443521 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.461084 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.494406 4865 scope.go:117] "RemoveContainer" containerID="63d7de5b1932f0adce37697171115e7b464af3a2630458868d9f71ebb24261de" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.495185 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 06:14:34 crc kubenswrapper[4865]: E1205 06:14:34.495289 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63d7de5b1932f0adce37697171115e7b464af3a2630458868d9f71ebb24261de\": container with ID starting with 63d7de5b1932f0adce37697171115e7b464af3a2630458868d9f71ebb24261de not found: ID does not exist" containerID="63d7de5b1932f0adce37697171115e7b464af3a2630458868d9f71ebb24261de" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.495315 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63d7de5b1932f0adce37697171115e7b464af3a2630458868d9f71ebb24261de"} err="failed to get container status \"63d7de5b1932f0adce37697171115e7b464af3a2630458868d9f71ebb24261de\": rpc error: code = NotFound desc = could not find container \"63d7de5b1932f0adce37697171115e7b464af3a2630458868d9f71ebb24261de\": container with ID starting with 63d7de5b1932f0adce37697171115e7b464af3a2630458868d9f71ebb24261de not found: ID does not exist" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.495336 4865 scope.go:117] "RemoveContainer" containerID="ed1fd4bf6a3c5ae267f278e43bc60e370c2d4bd2d5acaba3efaf95ace1f7c7dc" Dec 05 06:14:34 crc kubenswrapper[4865]: E1205 06:14:34.498369 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed1fd4bf6a3c5ae267f278e43bc60e370c2d4bd2d5acaba3efaf95ace1f7c7dc\": container with ID starting with ed1fd4bf6a3c5ae267f278e43bc60e370c2d4bd2d5acaba3efaf95ace1f7c7dc not found: ID does not exist" 
containerID="ed1fd4bf6a3c5ae267f278e43bc60e370c2d4bd2d5acaba3efaf95ace1f7c7dc" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.498408 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1fd4bf6a3c5ae267f278e43bc60e370c2d4bd2d5acaba3efaf95ace1f7c7dc"} err="failed to get container status \"ed1fd4bf6a3c5ae267f278e43bc60e370c2d4bd2d5acaba3efaf95ace1f7c7dc\": rpc error: code = NotFound desc = could not find container \"ed1fd4bf6a3c5ae267f278e43bc60e370c2d4bd2d5acaba3efaf95ace1f7c7dc\": container with ID starting with ed1fd4bf6a3c5ae267f278e43bc60e370c2d4bd2d5acaba3efaf95ace1f7c7dc not found: ID does not exist" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.498436 4865 scope.go:117] "RemoveContainer" containerID="9b7741d142fdf055c60f405cfeb253a5d6b508499e5299a02d218346309c287b" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.515893 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.533760 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.576891 4865 scope.go:117] "RemoveContainer" containerID="d31955fb8cddbf74252ee7082a0006522693c5add1b465a7006c1eb052e70c23" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.594300 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 06:14:34 crc kubenswrapper[4865]: E1205 06:14:34.595841 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c81d161-71fa-4b7d-b386-51c0eb914cb4" containerName="nova-api-log" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.595862 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c81d161-71fa-4b7d-b386-51c0eb914cb4" containerName="nova-api-log" Dec 05 06:14:34 crc kubenswrapper[4865]: E1205 06:14:34.595891 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2" containerName="nova-metadata-log" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.595898 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2" containerName="nova-metadata-log" Dec 05 06:14:34 crc kubenswrapper[4865]: E1205 06:14:34.595915 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c81d161-71fa-4b7d-b386-51c0eb914cb4" containerName="nova-api-api" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.595921 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c81d161-71fa-4b7d-b386-51c0eb914cb4" containerName="nova-api-api" Dec 05 06:14:34 crc kubenswrapper[4865]: E1205 06:14:34.595938 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2" containerName="nova-metadata-metadata" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.595944 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2" containerName="nova-metadata-metadata" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.596518 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2" containerName="nova-metadata-log" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.596563 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c81d161-71fa-4b7d-b386-51c0eb914cb4" containerName="nova-api-api" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.596594 4865 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4c81d161-71fa-4b7d-b386-51c0eb914cb4" containerName="nova-api-log" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.596628 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2" containerName="nova-metadata-metadata" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.602010 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.607798 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.612998 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.615379 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.623380 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.626274 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.626526 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.650067 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.706925 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-config-data\") pod \"nova-metadata-0\" (UID: \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " pod="openstack/nova-metadata-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.706995 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-config-data\") pod \"nova-api-0\" (UID: \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\") " pod="openstack/nova-api-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.707017 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-logs\") pod \"nova-api-0\" (UID: \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\") " pod="openstack/nova-api-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.707043 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " pod="openstack/nova-metadata-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.707067 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-logs\") pod \"nova-metadata-0\" (UID: \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " pod="openstack/nova-metadata-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.707172 4865 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk6xb\" (UniqueName: \"kubernetes.io/projected/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-kube-api-access-nk6xb\") pod \"nova-metadata-0\" (UID: \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " pod="openstack/nova-metadata-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.707217 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\") " pod="openstack/nova-api-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.707263 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4f4w\" (UniqueName: \"kubernetes.io/projected/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-kube-api-access-m4f4w\") pod \"nova-api-0\" (UID: \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\") " pod="openstack/nova-api-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.707305 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " pod="openstack/nova-metadata-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.809325 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-config-data\") pod \"nova-metadata-0\" (UID: \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " pod="openstack/nova-metadata-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.809422 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-config-data\") pod \"nova-api-0\" (UID: \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\") " pod="openstack/nova-api-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.809450 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-logs\") pod \"nova-api-0\" (UID: \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\") " pod="openstack/nova-api-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.809484 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " pod="openstack/nova-metadata-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.809519 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-logs\") pod \"nova-metadata-0\" (UID: \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " pod="openstack/nova-metadata-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.809587 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk6xb\" (UniqueName: \"kubernetes.io/projected/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-kube-api-access-nk6xb\") pod \"nova-metadata-0\" (UID: 
\"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " pod="openstack/nova-metadata-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.809635 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\") " pod="openstack/nova-api-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.809711 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4f4w\" (UniqueName: \"kubernetes.io/projected/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-kube-api-access-m4f4w\") pod \"nova-api-0\" (UID: \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\") " pod="openstack/nova-api-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.809757 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " pod="openstack/nova-metadata-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.810033 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-logs\") pod \"nova-metadata-0\" (UID: \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " pod="openstack/nova-metadata-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.810991 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-logs\") pod \"nova-api-0\" (UID: \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\") " pod="openstack/nova-api-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.814550 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-config-data\") pod \"nova-api-0\" (UID: \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\") " pod="openstack/nova-api-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.817279 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\") " pod="openstack/nova-api-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.817764 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " pod="openstack/nova-metadata-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.821244 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-config-data\") pod \"nova-metadata-0\" (UID: \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " pod="openstack/nova-metadata-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.825310 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " pod="openstack/nova-metadata-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.832326 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk6xb\" (UniqueName: \"kubernetes.io/projected/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-kube-api-access-nk6xb\") pod \"nova-metadata-0\" (UID: \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " pod="openstack/nova-metadata-0" Dec 05 06:14:34 crc kubenswrapper[4865]: I1205 06:14:34.836180 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4f4w\" (UniqueName: \"kubernetes.io/projected/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-kube-api-access-m4f4w\") pod \"nova-api-0\" (UID: \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\") " pod="openstack/nova-api-0" Dec 05 06:14:35 crc kubenswrapper[4865]: I1205 06:14:35.017158 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c81d161-71fa-4b7d-b386-51c0eb914cb4" path="/var/lib/kubelet/pods/4c81d161-71fa-4b7d-b386-51c0eb914cb4/volumes" Dec 05 06:14:35 crc kubenswrapper[4865]: I1205 06:14:35.017963 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce9f6f7e-e815-4cad-a620-08d913f360ec" path="/var/lib/kubelet/pods/ce9f6f7e-e815-4cad-a620-08d913f360ec/volumes" Dec 05 06:14:35 crc kubenswrapper[4865]: I1205 06:14:35.018500 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2" path="/var/lib/kubelet/pods/ee1ec84b-8bc5-47d2-aba2-17d2bcba16b2/volumes" Dec 05 06:14:35 crc kubenswrapper[4865]: I1205 06:14:35.034226 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 06:14:35 crc kubenswrapper[4865]: I1205 06:14:35.047713 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 06:14:35 crc kubenswrapper[4865]: I1205 06:14:35.423103 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"87766c6c-42b5-4851-9729-29f38ed36ae5","Type":"ContainerStarted","Data":"e144142e3bd1ff497b55b8908f46a023c17bb8404c904495f61e63e260de21d6"} Dec 05 06:14:35 crc kubenswrapper[4865]: I1205 06:14:35.423363 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"87766c6c-42b5-4851-9729-29f38ed36ae5","Type":"ContainerStarted","Data":"18f245630b70c3b5ac0b82c0456f032c5c7658e2dcd408d09fdf8733029dc0e6"} Dec 05 06:14:35 crc kubenswrapper[4865]: I1205 06:14:35.454043 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.454018714 podStartE2EDuration="2.454018714s" podCreationTimestamp="2025-12-05 06:14:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:14:35.441129177 +0000 UTC m=+1294.721140399" watchObservedRunningTime="2025-12-05 06:14:35.454018714 +0000 UTC m=+1294.734029936" Dec 05 06:14:35 crc kubenswrapper[4865]: I1205 06:14:35.582169 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:14:35 crc kubenswrapper[4865]: I1205 06:14:35.704055 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 06:14:36 crc kubenswrapper[4865]: I1205 06:14:36.440651 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cfe6c68c-e7ca-4000-92e1-47b23aaf3045","Type":"ContainerStarted","Data":"238336844db3246355c27c466ef2589189987c5f67815166ed3cce4c50bac7d4"} Dec 05 06:14:36 crc kubenswrapper[4865]: I1205 06:14:36.441055 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cfe6c68c-e7ca-4000-92e1-47b23aaf3045","Type":"ContainerStarted","Data":"e33bacbf31fa71f0a371f9ad6ab55d7ab46bd527ce59043001854f1f0a0275ec"} Dec 05 06:14:36 crc kubenswrapper[4865]: I1205 06:14:36.441072 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cfe6c68c-e7ca-4000-92e1-47b23aaf3045","Type":"ContainerStarted","Data":"f06e482d35d76d8c80b0b87d3c93e71bc0bd52201a5bf003e2bc02b92eeceffe"} Dec 05 06:14:36 crc kubenswrapper[4865]: I1205 06:14:36.451610 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5025acca-f209-4f4c-ab4e-b7e386f5c3ab","Type":"ContainerStarted","Data":"dda00ade2362b19bae318cabdf3adaeb9bbb1de17d6321f885296bf1ceea047c"} Dec 05 06:14:36 crc kubenswrapper[4865]: I1205 06:14:36.451650 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5025acca-f209-4f4c-ab4e-b7e386f5c3ab","Type":"ContainerStarted","Data":"7f1b9ce1734b8f703e12f26e1ba689eb26a81ec5bdc81bf08cc7045cf96572be"} Dec 05 06:14:36 crc kubenswrapper[4865]: I1205 06:14:36.451664 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5025acca-f209-4f4c-ab4e-b7e386f5c3ab","Type":"ContainerStarted","Data":"686b65b7a3958c691b877b885459dc85861353f4c3031eba6e3f3efa6cb38f44"} Dec 05 06:14:36 crc kubenswrapper[4865]: I1205 06:14:36.547747 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.547720725 podStartE2EDuration="2.547720725s" 
podCreationTimestamp="2025-12-05 06:14:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:14:36.543610628 +0000 UTC m=+1295.823621850" watchObservedRunningTime="2025-12-05 06:14:36.547720725 +0000 UTC m=+1295.827731947" Dec 05 06:14:36 crc kubenswrapper[4865]: I1205 06:14:36.549520 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.549506446 podStartE2EDuration="2.549506446s" podCreationTimestamp="2025-12-05 06:14:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:14:36.483327722 +0000 UTC m=+1295.763338944" watchObservedRunningTime="2025-12-05 06:14:36.549506446 +0000 UTC m=+1295.829517668" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.191702 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.279506 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgw5w\" (UniqueName: \"kubernetes.io/projected/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-kube-api-access-vgw5w\") pod \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.279605 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-logs\") pod \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.279684 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-horizon-tls-certs\") pod \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.279755 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-horizon-secret-key\") pod \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.279789 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-scripts\") pod \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.279884 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-combined-ca-bundle\") pod \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.279901 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-config-data\") pod \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\" (UID: \"ca38ca20-0d35-4058-b0f6-bbe4251c6aab\") " Dec 05 06:14:37 crc 
kubenswrapper[4865]: I1205 06:14:37.283273 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-logs" (OuterVolumeSpecName: "logs") pod "ca38ca20-0d35-4058-b0f6-bbe4251c6aab" (UID: "ca38ca20-0d35-4058-b0f6-bbe4251c6aab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.288296 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-kube-api-access-vgw5w" (OuterVolumeSpecName: "kube-api-access-vgw5w") pod "ca38ca20-0d35-4058-b0f6-bbe4251c6aab" (UID: "ca38ca20-0d35-4058-b0f6-bbe4251c6aab"). InnerVolumeSpecName "kube-api-access-vgw5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.310173 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ca38ca20-0d35-4058-b0f6-bbe4251c6aab" (UID: "ca38ca20-0d35-4058-b0f6-bbe4251c6aab"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.351294 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca38ca20-0d35-4058-b0f6-bbe4251c6aab" (UID: "ca38ca20-0d35-4058-b0f6-bbe4251c6aab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.352396 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-config-data" (OuterVolumeSpecName: "config-data") pod "ca38ca20-0d35-4058-b0f6-bbe4251c6aab" (UID: "ca38ca20-0d35-4058-b0f6-bbe4251c6aab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.377096 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-scripts" (OuterVolumeSpecName: "scripts") pod "ca38ca20-0d35-4058-b0f6-bbe4251c6aab" (UID: "ca38ca20-0d35-4058-b0f6-bbe4251c6aab"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.381949 4865 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.381982 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.381991 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.382000 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.382010 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgw5w\" (UniqueName: \"kubernetes.io/projected/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-kube-api-access-vgw5w\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.382018 4865 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-logs\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.393293 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "ca38ca20-0d35-4058-b0f6-bbe4251c6aab" (UID: "ca38ca20-0d35-4058-b0f6-bbe4251c6aab"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.464340 4865 generic.go:334] "Generic (PLEG): container finished" podID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerID="ffdf18b423c11d0cd7a29a406d546d87ab5e2562b00f5fe331d0abd8d62eb69f" exitCode=137 Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.465721 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bd68dd9b8-z62zt" event={"ID":"ca38ca20-0d35-4058-b0f6-bbe4251c6aab","Type":"ContainerDied","Data":"ffdf18b423c11d0cd7a29a406d546d87ab5e2562b00f5fe331d0abd8d62eb69f"} Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.465776 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bd68dd9b8-z62zt" event={"ID":"ca38ca20-0d35-4058-b0f6-bbe4251c6aab","Type":"ContainerDied","Data":"6d21b4ca635b89c25b1312497c7c48661f26fb2870d0fee384c6f7f828540f6a"} Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.465797 4865 scope.go:117] "RemoveContainer" containerID="ec7463e2f2848e1408401056294c94afe477842efd894f3e32e53fc184fec217" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.468554 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-bd68dd9b8-z62zt" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.492692 4865 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca38ca20-0d35-4058-b0f6-bbe4251c6aab-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.537769 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bd68dd9b8-z62zt"] Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.544880 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-bd68dd9b8-z62zt"] Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.636917 4865 scope.go:117] "RemoveContainer" containerID="ffdf18b423c11d0cd7a29a406d546d87ab5e2562b00f5fe331d0abd8d62eb69f" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.679374 4865 scope.go:117] "RemoveContainer" containerID="ec7463e2f2848e1408401056294c94afe477842efd894f3e32e53fc184fec217" Dec 05 06:14:37 crc kubenswrapper[4865]: E1205 06:14:37.679994 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec7463e2f2848e1408401056294c94afe477842efd894f3e32e53fc184fec217\": container with ID starting with ec7463e2f2848e1408401056294c94afe477842efd894f3e32e53fc184fec217 not found: ID does not exist" containerID="ec7463e2f2848e1408401056294c94afe477842efd894f3e32e53fc184fec217" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.680029 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec7463e2f2848e1408401056294c94afe477842efd894f3e32e53fc184fec217"} err="failed to get container status \"ec7463e2f2848e1408401056294c94afe477842efd894f3e32e53fc184fec217\": rpc error: code = NotFound desc = could not find container \"ec7463e2f2848e1408401056294c94afe477842efd894f3e32e53fc184fec217\": container with ID starting with ec7463e2f2848e1408401056294c94afe477842efd894f3e32e53fc184fec217 not found: ID does not exist" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.680056 4865 scope.go:117] "RemoveContainer" containerID="ffdf18b423c11d0cd7a29a406d546d87ab5e2562b00f5fe331d0abd8d62eb69f" Dec 05 06:14:37 crc kubenswrapper[4865]: E1205 06:14:37.680243 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffdf18b423c11d0cd7a29a406d546d87ab5e2562b00f5fe331d0abd8d62eb69f\": container with ID starting with ffdf18b423c11d0cd7a29a406d546d87ab5e2562b00f5fe331d0abd8d62eb69f not found: ID does not exist" containerID="ffdf18b423c11d0cd7a29a406d546d87ab5e2562b00f5fe331d0abd8d62eb69f" Dec 05 06:14:37 crc kubenswrapper[4865]: I1205 06:14:37.680258 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffdf18b423c11d0cd7a29a406d546d87ab5e2562b00f5fe331d0abd8d62eb69f"} err="failed to get container status \"ffdf18b423c11d0cd7a29a406d546d87ab5e2562b00f5fe331d0abd8d62eb69f\": rpc error: code = NotFound desc = could not find container \"ffdf18b423c11d0cd7a29a406d546d87ab5e2562b00f5fe331d0abd8d62eb69f\": container with ID starting with ffdf18b423c11d0cd7a29a406d546d87ab5e2562b00f5fe331d0abd8d62eb69f not found: ID does not exist" Dec 05 06:14:38 crc kubenswrapper[4865]: I1205 06:14:38.789414 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 06:14:39 crc kubenswrapper[4865]: I1205 06:14:39.016270 4865 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" path="/var/lib/kubelet/pods/ca38ca20-0d35-4058-b0f6-bbe4251c6aab/volumes" Dec 05 06:14:40 crc kubenswrapper[4865]: I1205 06:14:40.048682 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 06:14:40 crc kubenswrapper[4865]: I1205 06:14:40.048986 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 06:14:41 crc kubenswrapper[4865]: I1205 06:14:41.049062 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:14:41 crc kubenswrapper[4865]: I1205 06:14:41.049132 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:14:42 crc kubenswrapper[4865]: I1205 06:14:42.848578 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 05 06:14:43 crc kubenswrapper[4865]: I1205 06:14:43.789752 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 06:14:43 crc kubenswrapper[4865]: I1205 06:14:43.817144 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 06:14:44 crc kubenswrapper[4865]: I1205 06:14:44.561106 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 06:14:45 crc kubenswrapper[4865]: I1205 06:14:45.034806 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 06:14:45 crc kubenswrapper[4865]: I1205 06:14:45.035183 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 06:14:45 crc kubenswrapper[4865]: I1205 06:14:45.048872 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 06:14:45 crc kubenswrapper[4865]: I1205 06:14:45.048925 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 06:14:46 crc kubenswrapper[4865]: I1205 06:14:46.141081 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cfe6c68c-e7ca-4000-92e1-47b23aaf3045" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 06:14:46 crc kubenswrapper[4865]: I1205 06:14:46.141081 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5025acca-f209-4f4c-ab4e-b7e386f5c3ab" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 06:14:46 crc kubenswrapper[4865]: I1205 06:14:46.141479 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cfe6c68c-e7ca-4000-92e1-47b23aaf3045" 
containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 06:14:46 crc kubenswrapper[4865]: I1205 06:14:46.141506 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5025acca-f209-4f4c-ab4e-b7e386f5c3ab" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 06:14:50 crc kubenswrapper[4865]: W1205 06:14:50.907442 4865 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee1ec84b_8bc5_47d2_aba2_17d2bcba16b2.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee1ec84b_8bc5_47d2_aba2_17d2bcba16b2.slice: no such file or directory Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.241593 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.360472 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btwfn\" (UniqueName: \"kubernetes.io/projected/4e9b57b4-0806-43ad-8cb6-881ee1854ab5-kube-api-access-btwfn\") pod \"4e9b57b4-0806-43ad-8cb6-881ee1854ab5\" (UID: \"4e9b57b4-0806-43ad-8cb6-881ee1854ab5\") " Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.360784 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9b57b4-0806-43ad-8cb6-881ee1854ab5-config-data\") pod \"4e9b57b4-0806-43ad-8cb6-881ee1854ab5\" (UID: \"4e9b57b4-0806-43ad-8cb6-881ee1854ab5\") " Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.360988 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9b57b4-0806-43ad-8cb6-881ee1854ab5-combined-ca-bundle\") pod \"4e9b57b4-0806-43ad-8cb6-881ee1854ab5\" (UID: \"4e9b57b4-0806-43ad-8cb6-881ee1854ab5\") " Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.370255 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e9b57b4-0806-43ad-8cb6-881ee1854ab5-kube-api-access-btwfn" (OuterVolumeSpecName: "kube-api-access-btwfn") pod "4e9b57b4-0806-43ad-8cb6-881ee1854ab5" (UID: "4e9b57b4-0806-43ad-8cb6-881ee1854ab5"). InnerVolumeSpecName "kube-api-access-btwfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.397154 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e9b57b4-0806-43ad-8cb6-881ee1854ab5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e9b57b4-0806-43ad-8cb6-881ee1854ab5" (UID: "4e9b57b4-0806-43ad-8cb6-881ee1854ab5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.406454 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e9b57b4-0806-43ad-8cb6-881ee1854ab5-config-data" (OuterVolumeSpecName: "config-data") pod "4e9b57b4-0806-43ad-8cb6-881ee1854ab5" (UID: "4e9b57b4-0806-43ad-8cb6-881ee1854ab5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.463204 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btwfn\" (UniqueName: \"kubernetes.io/projected/4e9b57b4-0806-43ad-8cb6-881ee1854ab5-kube-api-access-btwfn\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.463428 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9b57b4-0806-43ad-8cb6-881ee1854ab5-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.463521 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9b57b4-0806-43ad-8cb6-881ee1854ab5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.612794 4865 generic.go:334] "Generic (PLEG): container finished" podID="4e9b57b4-0806-43ad-8cb6-881ee1854ab5" containerID="a45670bb6aa76fe78307f8f0d28430d5162b3bb373021b0df7db512ed5f42281" exitCode=137 Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.612856 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4e9b57b4-0806-43ad-8cb6-881ee1854ab5","Type":"ContainerDied","Data":"a45670bb6aa76fe78307f8f0d28430d5162b3bb373021b0df7db512ed5f42281"} Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.612887 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4e9b57b4-0806-43ad-8cb6-881ee1854ab5","Type":"ContainerDied","Data":"ef90a388eeab531b534d0a622ebad3894ddd76014ab986310d9ab47522486e05"} Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.612904 4865 scope.go:117] "RemoveContainer" containerID="a45670bb6aa76fe78307f8f0d28430d5162b3bb373021b0df7db512ed5f42281" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.613428 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.658551 4865 scope.go:117] "RemoveContainer" containerID="a45670bb6aa76fe78307f8f0d28430d5162b3bb373021b0df7db512ed5f42281" Dec 05 06:14:51 crc kubenswrapper[4865]: E1205 06:14:51.659259 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a45670bb6aa76fe78307f8f0d28430d5162b3bb373021b0df7db512ed5f42281\": container with ID starting with a45670bb6aa76fe78307f8f0d28430d5162b3bb373021b0df7db512ed5f42281 not found: ID does not exist" containerID="a45670bb6aa76fe78307f8f0d28430d5162b3bb373021b0df7db512ed5f42281" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.659293 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45670bb6aa76fe78307f8f0d28430d5162b3bb373021b0df7db512ed5f42281"} err="failed to get container status \"a45670bb6aa76fe78307f8f0d28430d5162b3bb373021b0df7db512ed5f42281\": rpc error: code = NotFound desc = could not find container \"a45670bb6aa76fe78307f8f0d28430d5162b3bb373021b0df7db512ed5f42281\": container with ID starting with a45670bb6aa76fe78307f8f0d28430d5162b3bb373021b0df7db512ed5f42281 not found: ID does not exist" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.674388 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.687554 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.699703 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 06:14:51 crc kubenswrapper[4865]: E1205 06:14:51.700211 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerName="horizon" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.700230 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerName="horizon" Dec 05 06:14:51 crc kubenswrapper[4865]: E1205 06:14:51.700248 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerName="horizon-log" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.700255 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerName="horizon-log" Dec 05 06:14:51 crc kubenswrapper[4865]: E1205 06:14:51.700269 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerName="horizon" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.700276 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerName="horizon" Dec 05 06:14:51 crc kubenswrapper[4865]: E1205 06:14:51.700295 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e9b57b4-0806-43ad-8cb6-881ee1854ab5" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.700301 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9b57b4-0806-43ad-8cb6-881ee1854ab5" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.700496 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" 
containerName="horizon-log" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.700520 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerName="horizon" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.700529 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e9b57b4-0806-43ad-8cb6-881ee1854ab5" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.701249 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.708422 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.728233 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.728443 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.728966 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.769173 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd86e12e-6ef3-41e5-9f84-e8d45ddaead0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd86e12e-6ef3-41e5-9f84-e8d45ddaead0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.769272 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd86e12e-6ef3-41e5-9f84-e8d45ddaead0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd86e12e-6ef3-41e5-9f84-e8d45ddaead0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.769292 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd86e12e-6ef3-41e5-9f84-e8d45ddaead0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd86e12e-6ef3-41e5-9f84-e8d45ddaead0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.769312 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd86e12e-6ef3-41e5-9f84-e8d45ddaead0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd86e12e-6ef3-41e5-9f84-e8d45ddaead0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.769326 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk2fq\" (UniqueName: \"kubernetes.io/projected/bd86e12e-6ef3-41e5-9f84-e8d45ddaead0-kube-api-access-vk2fq\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd86e12e-6ef3-41e5-9f84-e8d45ddaead0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.871164 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bd86e12e-6ef3-41e5-9f84-e8d45ddaead0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd86e12e-6ef3-41e5-9f84-e8d45ddaead0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.871379 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd86e12e-6ef3-41e5-9f84-e8d45ddaead0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd86e12e-6ef3-41e5-9f84-e8d45ddaead0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.871420 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd86e12e-6ef3-41e5-9f84-e8d45ddaead0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd86e12e-6ef3-41e5-9f84-e8d45ddaead0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.871453 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd86e12e-6ef3-41e5-9f84-e8d45ddaead0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd86e12e-6ef3-41e5-9f84-e8d45ddaead0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.871485 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk2fq\" (UniqueName: \"kubernetes.io/projected/bd86e12e-6ef3-41e5-9f84-e8d45ddaead0-kube-api-access-vk2fq\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd86e12e-6ef3-41e5-9f84-e8d45ddaead0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.876619 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd86e12e-6ef3-41e5-9f84-e8d45ddaead0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd86e12e-6ef3-41e5-9f84-e8d45ddaead0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.877382 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd86e12e-6ef3-41e5-9f84-e8d45ddaead0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd86e12e-6ef3-41e5-9f84-e8d45ddaead0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.877520 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd86e12e-6ef3-41e5-9f84-e8d45ddaead0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd86e12e-6ef3-41e5-9f84-e8d45ddaead0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.878486 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd86e12e-6ef3-41e5-9f84-e8d45ddaead0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd86e12e-6ef3-41e5-9f84-e8d45ddaead0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:51 crc kubenswrapper[4865]: I1205 06:14:51.903541 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk2fq\" (UniqueName: \"kubernetes.io/projected/bd86e12e-6ef3-41e5-9f84-e8d45ddaead0-kube-api-access-vk2fq\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"bd86e12e-6ef3-41e5-9f84-e8d45ddaead0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:52 crc kubenswrapper[4865]: I1205 06:14:52.045683 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:52 crc kubenswrapper[4865]: I1205 06:14:52.609014 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 06:14:52 crc kubenswrapper[4865]: I1205 06:14:52.629211 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bd86e12e-6ef3-41e5-9f84-e8d45ddaead0","Type":"ContainerStarted","Data":"846ba1b8985b8dd37efbdc80d440a34930897f00504f71f762d752e56a833abd"} Dec 05 06:14:53 crc kubenswrapper[4865]: I1205 06:14:53.017709 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e9b57b4-0806-43ad-8cb6-881ee1854ab5" path="/var/lib/kubelet/pods/4e9b57b4-0806-43ad-8cb6-881ee1854ab5/volumes" Dec 05 06:14:53 crc kubenswrapper[4865]: I1205 06:14:53.643431 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bd86e12e-6ef3-41e5-9f84-e8d45ddaead0","Type":"ContainerStarted","Data":"236eebf9f1b9ccca0df8e3654f9fcefc4ac851d28bacdfe995bc76395e7a5974"} Dec 05 06:14:53 crc kubenswrapper[4865]: I1205 06:14:53.672879 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.6728619890000003 podStartE2EDuration="2.672861989s" podCreationTimestamp="2025-12-05 06:14:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:14:53.672101498 +0000 UTC m=+1312.952112760" watchObservedRunningTime="2025-12-05 06:14:53.672861989 +0000 UTC m=+1312.952873211" Dec 05 06:14:54 crc kubenswrapper[4865]: I1205 06:14:54.657953 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 06:14:55 crc kubenswrapper[4865]: I1205 06:14:55.039559 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 06:14:55 crc kubenswrapper[4865]: I1205 06:14:55.040755 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 06:14:55 crc kubenswrapper[4865]: I1205 06:14:55.040855 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 06:14:55 crc kubenswrapper[4865]: I1205 06:14:55.046241 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 06:14:55 crc kubenswrapper[4865]: I1205 06:14:55.054760 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 06:14:55 crc kubenswrapper[4865]: I1205 06:14:55.055816 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 06:14:55 crc kubenswrapper[4865]: I1205 06:14:55.072169 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 06:14:55 crc kubenswrapper[4865]: I1205 06:14:55.662029 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 06:14:55 crc kubenswrapper[4865]: I1205 06:14:55.665063 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 06:14:55 crc kubenswrapper[4865]: 
I1205 06:14:55.669901 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 06:14:55 crc kubenswrapper[4865]: I1205 06:14:55.915840 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-fl7gw"] Dec 05 06:14:55 crc kubenswrapper[4865]: I1205 06:14:55.916404 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca38ca20-0d35-4058-b0f6-bbe4251c6aab" containerName="horizon" Dec 05 06:14:55 crc kubenswrapper[4865]: I1205 06:14:55.917596 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:55 crc kubenswrapper[4865]: I1205 06:14:55.954134 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-fl7gw\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:55 crc kubenswrapper[4865]: I1205 06:14:55.954207 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-fl7gw\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:55 crc kubenswrapper[4865]: I1205 06:14:55.954277 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-fl7gw\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:55 crc kubenswrapper[4865]: I1205 06:14:55.954303 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-fl7gw\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:55 crc kubenswrapper[4865]: I1205 06:14:55.954351 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-config\") pod \"dnsmasq-dns-5c7b6c5df9-fl7gw\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:55 crc kubenswrapper[4865]: I1205 06:14:55.954379 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj4vj\" (UniqueName: \"kubernetes.io/projected/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-kube-api-access-sj4vj\") pod \"dnsmasq-dns-5c7b6c5df9-fl7gw\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:55 crc kubenswrapper[4865]: I1205 06:14:55.954735 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-fl7gw"] Dec 05 06:14:56 crc kubenswrapper[4865]: I1205 06:14:56.058955 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-fl7gw\" 
(UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:56 crc kubenswrapper[4865]: I1205 06:14:56.059090 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-fl7gw\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:56 crc kubenswrapper[4865]: I1205 06:14:56.059206 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-fl7gw\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:56 crc kubenswrapper[4865]: I1205 06:14:56.059240 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-fl7gw\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:56 crc kubenswrapper[4865]: I1205 06:14:56.059340 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-config\") pod \"dnsmasq-dns-5c7b6c5df9-fl7gw\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:56 crc kubenswrapper[4865]: I1205 06:14:56.059369 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj4vj\" (UniqueName: \"kubernetes.io/projected/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-kube-api-access-sj4vj\") pod \"dnsmasq-dns-5c7b6c5df9-fl7gw\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:56 crc kubenswrapper[4865]: I1205 06:14:56.060941 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-fl7gw\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:56 crc kubenswrapper[4865]: I1205 06:14:56.061545 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-config\") pod \"dnsmasq-dns-5c7b6c5df9-fl7gw\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:56 crc kubenswrapper[4865]: I1205 06:14:56.061846 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-fl7gw\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:56 crc kubenswrapper[4865]: I1205 06:14:56.062380 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-fl7gw\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" 
Dec 05 06:14:56 crc kubenswrapper[4865]: I1205 06:14:56.062595 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-fl7gw\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:56 crc kubenswrapper[4865]: I1205 06:14:56.092955 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj4vj\" (UniqueName: \"kubernetes.io/projected/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-kube-api-access-sj4vj\") pod \"dnsmasq-dns-5c7b6c5df9-fl7gw\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:56 crc kubenswrapper[4865]: I1205 06:14:56.234987 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:56 crc kubenswrapper[4865]: I1205 06:14:56.753096 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-fl7gw"] Dec 05 06:14:56 crc kubenswrapper[4865]: W1205 06:14:56.758219 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36008d4f_e5dd_4cb6_9dc3_cd577bc48a5f.slice/crio-ac0c5d344f2ca920ba6b4eb22b0ff85c49ccfbd5c640e6557ec4ae3744388dfc WatchSource:0}: Error finding container ac0c5d344f2ca920ba6b4eb22b0ff85c49ccfbd5c640e6557ec4ae3744388dfc: Status 404 returned error can't find the container with id ac0c5d344f2ca920ba6b4eb22b0ff85c49ccfbd5c640e6557ec4ae3744388dfc Dec 05 06:14:57 crc kubenswrapper[4865]: I1205 06:14:57.045801 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:14:57 crc kubenswrapper[4865]: I1205 06:14:57.678194 4865 generic.go:334] "Generic (PLEG): container finished" podID="36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f" containerID="19f745284ac0e46ccac72ab5e30bce50daf9e7437d8128939d316e8f14376821" exitCode=0 Dec 05 06:14:57 crc kubenswrapper[4865]: I1205 06:14:57.678250 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" event={"ID":"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f","Type":"ContainerDied","Data":"19f745284ac0e46ccac72ab5e30bce50daf9e7437d8128939d316e8f14376821"} Dec 05 06:14:57 crc kubenswrapper[4865]: I1205 06:14:57.678310 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" event={"ID":"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f","Type":"ContainerStarted","Data":"ac0c5d344f2ca920ba6b4eb22b0ff85c49ccfbd5c640e6557ec4ae3744388dfc"} Dec 05 06:14:58 crc kubenswrapper[4865]: I1205 06:14:58.690533 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" event={"ID":"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f","Type":"ContainerStarted","Data":"52ebbf0680e2ca6f273e96976bfe72f482b1adab037c62d4d0d998abf2e0f489"} Dec 05 06:14:58 crc kubenswrapper[4865]: I1205 06:14:58.691158 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:14:58 crc kubenswrapper[4865]: I1205 06:14:58.709560 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" podStartSLOduration=3.709538412 podStartE2EDuration="3.709538412s" podCreationTimestamp="2025-12-05 06:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:14:58.707641979 +0000 UTC m=+1317.987653211" watchObservedRunningTime="2025-12-05 06:14:58.709538412 +0000 UTC m=+1317.989549634" Dec 05 06:14:59 crc kubenswrapper[4865]: I1205 06:14:59.258517 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 06:14:59 crc kubenswrapper[4865]: I1205 06:14:59.259144 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cfe6c68c-e7ca-4000-92e1-47b23aaf3045" containerName="nova-api-log" containerID="cri-o://e33bacbf31fa71f0a371f9ad6ab55d7ab46bd527ce59043001854f1f0a0275ec" gracePeriod=30 Dec 05 06:14:59 crc kubenswrapper[4865]: I1205 06:14:59.259221 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cfe6c68c-e7ca-4000-92e1-47b23aaf3045" containerName="nova-api-api" containerID="cri-o://238336844db3246355c27c466ef2589189987c5f67815166ed3cce4c50bac7d4" gracePeriod=30 Dec 05 06:14:59 crc kubenswrapper[4865]: I1205 06:14:59.701399 4865 generic.go:334] "Generic (PLEG): container finished" podID="cfe6c68c-e7ca-4000-92e1-47b23aaf3045" containerID="e33bacbf31fa71f0a371f9ad6ab55d7ab46bd527ce59043001854f1f0a0275ec" exitCode=143 Dec 05 06:14:59 crc kubenswrapper[4865]: I1205 06:14:59.701475 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cfe6c68c-e7ca-4000-92e1-47b23aaf3045","Type":"ContainerDied","Data":"e33bacbf31fa71f0a371f9ad6ab55d7ab46bd527ce59043001854f1f0a0275ec"} Dec 05 06:15:00 crc kubenswrapper[4865]: I1205 06:15:00.177260 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv"] Dec 05 06:15:00 crc kubenswrapper[4865]: I1205 06:15:00.178623 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv" Dec 05 06:15:00 crc kubenswrapper[4865]: I1205 06:15:00.181145 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 06:15:00 crc kubenswrapper[4865]: I1205 06:15:00.181472 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 06:15:00 crc kubenswrapper[4865]: I1205 06:15:00.191664 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv"] Dec 05 06:15:00 crc kubenswrapper[4865]: I1205 06:15:00.286120 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88440d1b-88fe-4432-a09d-df871904d502-secret-volume\") pod \"collect-profiles-29415255-c89vv\" (UID: \"88440d1b-88fe-4432-a09d-df871904d502\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv" Dec 05 06:15:00 crc kubenswrapper[4865]: I1205 06:15:00.298069 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88440d1b-88fe-4432-a09d-df871904d502-config-volume\") pod \"collect-profiles-29415255-c89vv\" (UID: \"88440d1b-88fe-4432-a09d-df871904d502\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv" Dec 05 06:15:00 crc kubenswrapper[4865]: I1205 06:15:00.299230 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8sm6\" (UniqueName: \"kubernetes.io/projected/88440d1b-88fe-4432-a09d-df871904d502-kube-api-access-p8sm6\") pod \"collect-profiles-29415255-c89vv\" (UID: \"88440d1b-88fe-4432-a09d-df871904d502\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv" Dec 05 06:15:00 crc kubenswrapper[4865]: I1205 06:15:00.401383 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88440d1b-88fe-4432-a09d-df871904d502-secret-volume\") pod \"collect-profiles-29415255-c89vv\" (UID: \"88440d1b-88fe-4432-a09d-df871904d502\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv" Dec 05 06:15:00 crc kubenswrapper[4865]: I1205 06:15:00.401438 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88440d1b-88fe-4432-a09d-df871904d502-config-volume\") pod \"collect-profiles-29415255-c89vv\" (UID: \"88440d1b-88fe-4432-a09d-df871904d502\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv" Dec 05 06:15:00 crc kubenswrapper[4865]: I1205 06:15:00.401503 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8sm6\" (UniqueName: \"kubernetes.io/projected/88440d1b-88fe-4432-a09d-df871904d502-kube-api-access-p8sm6\") pod \"collect-profiles-29415255-c89vv\" (UID: \"88440d1b-88fe-4432-a09d-df871904d502\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv" Dec 05 06:15:00 crc kubenswrapper[4865]: I1205 06:15:00.402469 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88440d1b-88fe-4432-a09d-df871904d502-config-volume\") pod 
\"collect-profiles-29415255-c89vv\" (UID: \"88440d1b-88fe-4432-a09d-df871904d502\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv" Dec 05 06:15:00 crc kubenswrapper[4865]: I1205 06:15:00.417577 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88440d1b-88fe-4432-a09d-df871904d502-secret-volume\") pod \"collect-profiles-29415255-c89vv\" (UID: \"88440d1b-88fe-4432-a09d-df871904d502\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv" Dec 05 06:15:00 crc kubenswrapper[4865]: I1205 06:15:00.436741 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8sm6\" (UniqueName: \"kubernetes.io/projected/88440d1b-88fe-4432-a09d-df871904d502-kube-api-access-p8sm6\") pod \"collect-profiles-29415255-c89vv\" (UID: \"88440d1b-88fe-4432-a09d-df871904d502\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv" Dec 05 06:15:00 crc kubenswrapper[4865]: I1205 06:15:00.537915 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv" Dec 05 06:15:01 crc kubenswrapper[4865]: I1205 06:15:01.100596 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv"] Dec 05 06:15:01 crc kubenswrapper[4865]: I1205 06:15:01.126295 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:15:01 crc kubenswrapper[4865]: I1205 06:15:01.126846 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerName="ceilometer-central-agent" containerID="cri-o://26dcf7afe8f1130270983c82cec445391c3d56b2814ce108531cf02e93897d0a" gracePeriod=30 Dec 05 06:15:01 crc kubenswrapper[4865]: I1205 06:15:01.127004 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerName="proxy-httpd" containerID="cri-o://d1790cc075eaf351478d388dbb84bec06d1ed3ab500165ba61b4978fc6ebaf9f" gracePeriod=30 Dec 05 06:15:01 crc kubenswrapper[4865]: I1205 06:15:01.127052 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerName="sg-core" containerID="cri-o://c65419b3db36eed33b8240e2d190a8e80c2e824b7738cb8308d6481e027e6c87" gracePeriod=30 Dec 05 06:15:01 crc kubenswrapper[4865]: I1205 06:15:01.127125 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerName="ceilometer-notification-agent" containerID="cri-o://2b6f757c83e0c97cad4df3d2c88efced23e4f88e29586484c9852217cf4cd6e5" gracePeriod=30 Dec 05 06:15:01 crc kubenswrapper[4865]: I1205 06:15:01.745386 4865 generic.go:334] "Generic (PLEG): container finished" podID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerID="d1790cc075eaf351478d388dbb84bec06d1ed3ab500165ba61b4978fc6ebaf9f" exitCode=0 Dec 05 06:15:01 crc kubenswrapper[4865]: I1205 06:15:01.745701 4865 generic.go:334] "Generic (PLEG): container finished" podID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerID="c65419b3db36eed33b8240e2d190a8e80c2e824b7738cb8308d6481e027e6c87" exitCode=2 Dec 05 06:15:01 crc kubenswrapper[4865]: I1205 06:15:01.745712 4865 generic.go:334] 
"Generic (PLEG): container finished" podID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerID="26dcf7afe8f1130270983c82cec445391c3d56b2814ce108531cf02e93897d0a" exitCode=0 Dec 05 06:15:01 crc kubenswrapper[4865]: I1205 06:15:01.745446 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7834dad-27c7-4da7-9e87-1d5196269fe4","Type":"ContainerDied","Data":"d1790cc075eaf351478d388dbb84bec06d1ed3ab500165ba61b4978fc6ebaf9f"} Dec 05 06:15:01 crc kubenswrapper[4865]: I1205 06:15:01.745793 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7834dad-27c7-4da7-9e87-1d5196269fe4","Type":"ContainerDied","Data":"c65419b3db36eed33b8240e2d190a8e80c2e824b7738cb8308d6481e027e6c87"} Dec 05 06:15:01 crc kubenswrapper[4865]: I1205 06:15:01.745814 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7834dad-27c7-4da7-9e87-1d5196269fe4","Type":"ContainerDied","Data":"26dcf7afe8f1130270983c82cec445391c3d56b2814ce108531cf02e93897d0a"} Dec 05 06:15:01 crc kubenswrapper[4865]: I1205 06:15:01.748197 4865 generic.go:334] "Generic (PLEG): container finished" podID="88440d1b-88fe-4432-a09d-df871904d502" containerID="9cf6a947e61a32670b42daa890ce63a443c01d59a593e3f1cdbc031c5047aaa3" exitCode=0 Dec 05 06:15:01 crc kubenswrapper[4865]: I1205 06:15:01.748270 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv" event={"ID":"88440d1b-88fe-4432-a09d-df871904d502","Type":"ContainerDied","Data":"9cf6a947e61a32670b42daa890ce63a443c01d59a593e3f1cdbc031c5047aaa3"} Dec 05 06:15:01 crc kubenswrapper[4865]: I1205 06:15:01.748449 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv" event={"ID":"88440d1b-88fe-4432-a09d-df871904d502","Type":"ContainerStarted","Data":"594ecc6f017945a1b86dd053c79134fd598d0e1f8a68ad4029ec1c9042696391"} Dec 05 06:15:01 crc kubenswrapper[4865]: I1205 06:15:01.876387 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fg748"] Dec 05 06:15:01 crc kubenswrapper[4865]: I1205 06:15:01.882338 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fg748" Dec 05 06:15:01 crc kubenswrapper[4865]: I1205 06:15:01.893399 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fg748"] Dec 05 06:15:02 crc kubenswrapper[4865]: I1205 06:15:02.033905 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2bc4e34-9781-48f0-84ac-3a7d3e583311-utilities\") pod \"redhat-operators-fg748\" (UID: \"b2bc4e34-9781-48f0-84ac-3a7d3e583311\") " pod="openshift-marketplace/redhat-operators-fg748" Dec 05 06:15:02 crc kubenswrapper[4865]: I1205 06:15:02.034301 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2bc4e34-9781-48f0-84ac-3a7d3e583311-catalog-content\") pod \"redhat-operators-fg748\" (UID: \"b2bc4e34-9781-48f0-84ac-3a7d3e583311\") " pod="openshift-marketplace/redhat-operators-fg748" Dec 05 06:15:02 crc kubenswrapper[4865]: I1205 06:15:02.034348 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bckrg\" (UniqueName: \"kubernetes.io/projected/b2bc4e34-9781-48f0-84ac-3a7d3e583311-kube-api-access-bckrg\") pod \"redhat-operators-fg748\" (UID: \"b2bc4e34-9781-48f0-84ac-3a7d3e583311\") " pod="openshift-marketplace/redhat-operators-fg748" Dec 05 06:15:02 crc kubenswrapper[4865]: I1205 06:15:02.047065 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:15:02 crc kubenswrapper[4865]: I1205 06:15:02.074685 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:15:02 crc kubenswrapper[4865]: I1205 06:15:02.135770 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2bc4e34-9781-48f0-84ac-3a7d3e583311-utilities\") pod \"redhat-operators-fg748\" (UID: \"b2bc4e34-9781-48f0-84ac-3a7d3e583311\") " pod="openshift-marketplace/redhat-operators-fg748" Dec 05 06:15:02 crc kubenswrapper[4865]: I1205 06:15:02.136272 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2bc4e34-9781-48f0-84ac-3a7d3e583311-utilities\") pod \"redhat-operators-fg748\" (UID: \"b2bc4e34-9781-48f0-84ac-3a7d3e583311\") " pod="openshift-marketplace/redhat-operators-fg748" Dec 05 06:15:02 crc kubenswrapper[4865]: I1205 06:15:02.137112 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2bc4e34-9781-48f0-84ac-3a7d3e583311-catalog-content\") pod \"redhat-operators-fg748\" (UID: \"b2bc4e34-9781-48f0-84ac-3a7d3e583311\") " pod="openshift-marketplace/redhat-operators-fg748" Dec 05 06:15:02 crc kubenswrapper[4865]: I1205 06:15:02.137280 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bckrg\" (UniqueName: \"kubernetes.io/projected/b2bc4e34-9781-48f0-84ac-3a7d3e583311-kube-api-access-bckrg\") pod \"redhat-operators-fg748\" (UID: \"b2bc4e34-9781-48f0-84ac-3a7d3e583311\") " pod="openshift-marketplace/redhat-operators-fg748" Dec 05 06:15:02 crc kubenswrapper[4865]: I1205 06:15:02.137347 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b2bc4e34-9781-48f0-84ac-3a7d3e583311-catalog-content\") pod \"redhat-operators-fg748\" (UID: \"b2bc4e34-9781-48f0-84ac-3a7d3e583311\") " pod="openshift-marketplace/redhat-operators-fg748" Dec 05 06:15:02 crc kubenswrapper[4865]: I1205 06:15:02.163753 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bckrg\" (UniqueName: \"kubernetes.io/projected/b2bc4e34-9781-48f0-84ac-3a7d3e583311-kube-api-access-bckrg\") pod \"redhat-operators-fg748\" (UID: \"b2bc4e34-9781-48f0-84ac-3a7d3e583311\") " pod="openshift-marketplace/redhat-operators-fg748" Dec 05 06:15:02 crc kubenswrapper[4865]: I1205 06:15:02.240227 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fg748" Dec 05 06:15:02 crc kubenswrapper[4865]: I1205 06:15:02.758522 4865 generic.go:334] "Generic (PLEG): container finished" podID="cfe6c68c-e7ca-4000-92e1-47b23aaf3045" containerID="238336844db3246355c27c466ef2589189987c5f67815166ed3cce4c50bac7d4" exitCode=0 Dec 05 06:15:02 crc kubenswrapper[4865]: I1205 06:15:02.758948 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cfe6c68c-e7ca-4000-92e1-47b23aaf3045","Type":"ContainerDied","Data":"238336844db3246355c27c466ef2589189987c5f67815166ed3cce4c50bac7d4"} Dec 05 06:15:02 crc kubenswrapper[4865]: I1205 06:15:02.820375 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fg748"] Dec 05 06:15:02 crc kubenswrapper[4865]: I1205 06:15:02.842395 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 05 06:15:02 crc kubenswrapper[4865]: I1205 06:15:02.960399 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.086511 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-config-data\") pod \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\" (UID: \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\") " Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.086712 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-combined-ca-bundle\") pod \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\" (UID: \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\") " Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.086765 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-logs\") pod \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\" (UID: \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\") " Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.086802 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4f4w\" (UniqueName: \"kubernetes.io/projected/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-kube-api-access-m4f4w\") pod \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\" (UID: \"cfe6c68c-e7ca-4000-92e1-47b23aaf3045\") " Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.088313 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-logs" (OuterVolumeSpecName: "logs") pod "cfe6c68c-e7ca-4000-92e1-47b23aaf3045" (UID: "cfe6c68c-e7ca-4000-92e1-47b23aaf3045"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.110597 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-kube-api-access-m4f4w" (OuterVolumeSpecName: "kube-api-access-m4f4w") pod "cfe6c68c-e7ca-4000-92e1-47b23aaf3045" (UID: "cfe6c68c-e7ca-4000-92e1-47b23aaf3045"). InnerVolumeSpecName "kube-api-access-m4f4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.165582 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfe6c68c-e7ca-4000-92e1-47b23aaf3045" (UID: "cfe6c68c-e7ca-4000-92e1-47b23aaf3045"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.189276 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-vfgpb"] Dec 05 06:15:03 crc kubenswrapper[4865]: E1205 06:15:03.190001 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe6c68c-e7ca-4000-92e1-47b23aaf3045" containerName="nova-api-log" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.190014 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe6c68c-e7ca-4000-92e1-47b23aaf3045" containerName="nova-api-log" Dec 05 06:15:03 crc kubenswrapper[4865]: E1205 06:15:03.190031 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe6c68c-e7ca-4000-92e1-47b23aaf3045" containerName="nova-api-api" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.190037 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe6c68c-e7ca-4000-92e1-47b23aaf3045" containerName="nova-api-api" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.190245 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe6c68c-e7ca-4000-92e1-47b23aaf3045" containerName="nova-api-log" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.190272 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe6c68c-e7ca-4000-92e1-47b23aaf3045" containerName="nova-api-api" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.190957 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vfgpb" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.193089 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.193104 4865 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-logs\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.193115 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4f4w\" (UniqueName: \"kubernetes.io/projected/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-kube-api-access-m4f4w\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.201020 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.202602 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.203647 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vfgpb"] Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.282426 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.285499 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-config-data" (OuterVolumeSpecName: "config-data") pod "cfe6c68c-e7ca-4000-92e1-47b23aaf3045" (UID: "cfe6c68c-e7ca-4000-92e1-47b23aaf3045"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.294875 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/573cd206-8f29-473e-8394-e862c8ef17e5-config-data\") pod \"nova-cell1-cell-mapping-vfgpb\" (UID: \"573cd206-8f29-473e-8394-e862c8ef17e5\") " pod="openstack/nova-cell1-cell-mapping-vfgpb" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.294928 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/573cd206-8f29-473e-8394-e862c8ef17e5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vfgpb\" (UID: \"573cd206-8f29-473e-8394-e862c8ef17e5\") " pod="openstack/nova-cell1-cell-mapping-vfgpb" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.301855 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/573cd206-8f29-473e-8394-e862c8ef17e5-scripts\") pod \"nova-cell1-cell-mapping-vfgpb\" (UID: \"573cd206-8f29-473e-8394-e862c8ef17e5\") " pod="openstack/nova-cell1-cell-mapping-vfgpb" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.302004 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq5mc\" (UniqueName: \"kubernetes.io/projected/573cd206-8f29-473e-8394-e862c8ef17e5-kube-api-access-dq5mc\") pod \"nova-cell1-cell-mapping-vfgpb\" (UID: \"573cd206-8f29-473e-8394-e862c8ef17e5\") " pod="openstack/nova-cell1-cell-mapping-vfgpb" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.302181 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe6c68c-e7ca-4000-92e1-47b23aaf3045-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.404792 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88440d1b-88fe-4432-a09d-df871904d502-config-volume\") pod \"88440d1b-88fe-4432-a09d-df871904d502\" (UID: \"88440d1b-88fe-4432-a09d-df871904d502\") " Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.407230 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88440d1b-88fe-4432-a09d-df871904d502-config-volume" (OuterVolumeSpecName: "config-volume") pod "88440d1b-88fe-4432-a09d-df871904d502" (UID: "88440d1b-88fe-4432-a09d-df871904d502"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.410798 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8sm6\" (UniqueName: \"kubernetes.io/projected/88440d1b-88fe-4432-a09d-df871904d502-kube-api-access-p8sm6\") pod \"88440d1b-88fe-4432-a09d-df871904d502\" (UID: \"88440d1b-88fe-4432-a09d-df871904d502\") " Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.411027 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88440d1b-88fe-4432-a09d-df871904d502-secret-volume\") pod \"88440d1b-88fe-4432-a09d-df871904d502\" (UID: \"88440d1b-88fe-4432-a09d-df871904d502\") " Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.411249 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/573cd206-8f29-473e-8394-e862c8ef17e5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vfgpb\" (UID: \"573cd206-8f29-473e-8394-e862c8ef17e5\") " pod="openstack/nova-cell1-cell-mapping-vfgpb" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.411529 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/573cd206-8f29-473e-8394-e862c8ef17e5-scripts\") pod \"nova-cell1-cell-mapping-vfgpb\" (UID: \"573cd206-8f29-473e-8394-e862c8ef17e5\") " pod="openstack/nova-cell1-cell-mapping-vfgpb" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.411614 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq5mc\" (UniqueName: \"kubernetes.io/projected/573cd206-8f29-473e-8394-e862c8ef17e5-kube-api-access-dq5mc\") pod \"nova-cell1-cell-mapping-vfgpb\" (UID: \"573cd206-8f29-473e-8394-e862c8ef17e5\") " pod="openstack/nova-cell1-cell-mapping-vfgpb" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.411736 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/573cd206-8f29-473e-8394-e862c8ef17e5-config-data\") pod \"nova-cell1-cell-mapping-vfgpb\" (UID: \"573cd206-8f29-473e-8394-e862c8ef17e5\") " pod="openstack/nova-cell1-cell-mapping-vfgpb" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.411789 4865 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88440d1b-88fe-4432-a09d-df871904d502-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.416957 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88440d1b-88fe-4432-a09d-df871904d502-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "88440d1b-88fe-4432-a09d-df871904d502" (UID: "88440d1b-88fe-4432-a09d-df871904d502"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.417480 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88440d1b-88fe-4432-a09d-df871904d502-kube-api-access-p8sm6" (OuterVolumeSpecName: "kube-api-access-p8sm6") pod "88440d1b-88fe-4432-a09d-df871904d502" (UID: "88440d1b-88fe-4432-a09d-df871904d502"). InnerVolumeSpecName "kube-api-access-p8sm6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.418761 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/573cd206-8f29-473e-8394-e862c8ef17e5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vfgpb\" (UID: \"573cd206-8f29-473e-8394-e862c8ef17e5\") " pod="openstack/nova-cell1-cell-mapping-vfgpb" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.419214 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/573cd206-8f29-473e-8394-e862c8ef17e5-scripts\") pod \"nova-cell1-cell-mapping-vfgpb\" (UID: \"573cd206-8f29-473e-8394-e862c8ef17e5\") " pod="openstack/nova-cell1-cell-mapping-vfgpb" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.423180 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/573cd206-8f29-473e-8394-e862c8ef17e5-config-data\") pod \"nova-cell1-cell-mapping-vfgpb\" (UID: \"573cd206-8f29-473e-8394-e862c8ef17e5\") " pod="openstack/nova-cell1-cell-mapping-vfgpb" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.435799 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq5mc\" (UniqueName: \"kubernetes.io/projected/573cd206-8f29-473e-8394-e862c8ef17e5-kube-api-access-dq5mc\") pod \"nova-cell1-cell-mapping-vfgpb\" (UID: \"573cd206-8f29-473e-8394-e862c8ef17e5\") " pod="openstack/nova-cell1-cell-mapping-vfgpb" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.513939 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8sm6\" (UniqueName: \"kubernetes.io/projected/88440d1b-88fe-4432-a09d-df871904d502-kube-api-access-p8sm6\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.513975 4865 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88440d1b-88fe-4432-a09d-df871904d502-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.550147 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vfgpb" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.670938 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.771526 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cfe6c68c-e7ca-4000-92e1-47b23aaf3045","Type":"ContainerDied","Data":"f06e482d35d76d8c80b0b87d3c93e71bc0bd52201a5bf003e2bc02b92eeceffe"} Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.771876 4865 scope.go:117] "RemoveContainer" containerID="238336844db3246355c27c466ef2589189987c5f67815166ed3cce4c50bac7d4" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.772029 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.801221 4865 generic.go:334] "Generic (PLEG): container finished" podID="b2bc4e34-9781-48f0-84ac-3a7d3e583311" containerID="9814487d2e8babeb60ecd2e3c6645f84fd96fac32da57300c556b365b8b345f4" exitCode=0 Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.801284 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg748" event={"ID":"b2bc4e34-9781-48f0-84ac-3a7d3e583311","Type":"ContainerDied","Data":"9814487d2e8babeb60ecd2e3c6645f84fd96fac32da57300c556b365b8b345f4"} Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.801310 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg748" event={"ID":"b2bc4e34-9781-48f0-84ac-3a7d3e583311","Type":"ContainerStarted","Data":"1cc7ac23e350206ac090f6c1229214d9ef4c1b0889e82eab9cd9a8b82518d93c"} Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.822418 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-sg-core-conf-yaml\") pod \"f7834dad-27c7-4da7-9e87-1d5196269fe4\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.822454 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-config-data\") pod \"f7834dad-27c7-4da7-9e87-1d5196269fe4\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.822493 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7834dad-27c7-4da7-9e87-1d5196269fe4-run-httpd\") pod \"f7834dad-27c7-4da7-9e87-1d5196269fe4\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.822519 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-combined-ca-bundle\") pod \"f7834dad-27c7-4da7-9e87-1d5196269fe4\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.822534 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7834dad-27c7-4da7-9e87-1d5196269fe4-log-httpd\") pod \"f7834dad-27c7-4da7-9e87-1d5196269fe4\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.822590 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-scripts\") pod \"f7834dad-27c7-4da7-9e87-1d5196269fe4\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.822650 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drkbf\" (UniqueName: \"kubernetes.io/projected/f7834dad-27c7-4da7-9e87-1d5196269fe4-kube-api-access-drkbf\") pod \"f7834dad-27c7-4da7-9e87-1d5196269fe4\" (UID: \"f7834dad-27c7-4da7-9e87-1d5196269fe4\") " Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.826441 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv" event={"ID":"88440d1b-88fe-4432-a09d-df871904d502","Type":"ContainerDied","Data":"594ecc6f017945a1b86dd053c79134fd598d0e1f8a68ad4029ec1c9042696391"} Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.826541 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="594ecc6f017945a1b86dd053c79134fd598d0e1f8a68ad4029ec1c9042696391" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.826681 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.854097 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7834dad-27c7-4da7-9e87-1d5196269fe4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f7834dad-27c7-4da7-9e87-1d5196269fe4" (UID: "f7834dad-27c7-4da7-9e87-1d5196269fe4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.862110 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.864268 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7834dad-27c7-4da7-9e87-1d5196269fe4-kube-api-access-drkbf" (OuterVolumeSpecName: "kube-api-access-drkbf") pod "f7834dad-27c7-4da7-9e87-1d5196269fe4" (UID: "f7834dad-27c7-4da7-9e87-1d5196269fe4"). InnerVolumeSpecName "kube-api-access-drkbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.864507 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7834dad-27c7-4da7-9e87-1d5196269fe4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f7834dad-27c7-4da7-9e87-1d5196269fe4" (UID: "f7834dad-27c7-4da7-9e87-1d5196269fe4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.877983 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-scripts" (OuterVolumeSpecName: "scripts") pod "f7834dad-27c7-4da7-9e87-1d5196269fe4" (UID: "f7834dad-27c7-4da7-9e87-1d5196269fe4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.895225 4865 scope.go:117] "RemoveContainer" containerID="e33bacbf31fa71f0a371f9ad6ab55d7ab46bd527ce59043001854f1f0a0275ec" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.926211 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.928451 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drkbf\" (UniqueName: \"kubernetes.io/projected/f7834dad-27c7-4da7-9e87-1d5196269fe4-kube-api-access-drkbf\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.928473 4865 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7834dad-27c7-4da7-9e87-1d5196269fe4-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.928482 4865 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7834dad-27c7-4da7-9e87-1d5196269fe4-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.928494 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.934723 4865 generic.go:334] "Generic (PLEG): container finished" podID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerID="2b6f757c83e0c97cad4df3d2c88efced23e4f88e29586484c9852217cf4cd6e5" exitCode=0 Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.934796 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.934817 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7834dad-27c7-4da7-9e87-1d5196269fe4","Type":"ContainerDied","Data":"2b6f757c83e0c97cad4df3d2c88efced23e4f88e29586484c9852217cf4cd6e5"} Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.935555 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7834dad-27c7-4da7-9e87-1d5196269fe4","Type":"ContainerDied","Data":"90e38e3e66851ebf348411d45a965412324644051dc86dc780020ffd27b3860a"} Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.940101 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 06:15:03 crc kubenswrapper[4865]: E1205 06:15:03.941102 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerName="proxy-httpd" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.941189 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerName="proxy-httpd" Dec 05 06:15:03 crc kubenswrapper[4865]: E1205 06:15:03.941274 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88440d1b-88fe-4432-a09d-df871904d502" containerName="collect-profiles" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.941348 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="88440d1b-88fe-4432-a09d-df871904d502" containerName="collect-profiles" Dec 05 06:15:03 crc kubenswrapper[4865]: E1205 06:15:03.941459 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerName="sg-core" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.941616 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerName="sg-core" Dec 05 06:15:03 crc kubenswrapper[4865]: E1205 06:15:03.941748 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerName="ceilometer-central-agent" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.941916 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerName="ceilometer-central-agent" Dec 05 06:15:03 crc kubenswrapper[4865]: E1205 06:15:03.942012 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerName="ceilometer-notification-agent" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.942089 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerName="ceilometer-notification-agent" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.942366 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerName="ceilometer-central-agent" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.942436 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="88440d1b-88fe-4432-a09d-df871904d502" containerName="collect-profiles" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.942518 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerName="proxy-httpd" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.942996 4865 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerName="sg-core" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.943145 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7834dad-27c7-4da7-9e87-1d5196269fe4" containerName="ceilometer-notification-agent" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.948834 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.960154 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.961846 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.963603 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.985418 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 06:15:03 crc kubenswrapper[4865]: I1205 06:15:03.987611 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f7834dad-27c7-4da7-9e87-1d5196269fe4" (UID: "f7834dad-27c7-4da7-9e87-1d5196269fe4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.015036 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7834dad-27c7-4da7-9e87-1d5196269fe4" (UID: "f7834dad-27c7-4da7-9e87-1d5196269fe4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.029671 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " pod="openstack/nova-api-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.029776 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-config-data\") pod \"nova-api-0\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " pod="openstack/nova-api-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.029894 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-public-tls-certs\") pod \"nova-api-0\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " pod="openstack/nova-api-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.029932 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8907607c-d42a-49ac-9df0-9da2ceb015eb-logs\") pod \"nova-api-0\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " pod="openstack/nova-api-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.029952 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj9sm\" (UniqueName: \"kubernetes.io/projected/8907607c-d42a-49ac-9df0-9da2ceb015eb-kube-api-access-jj9sm\") pod \"nova-api-0\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " pod="openstack/nova-api-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.030026 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " pod="openstack/nova-api-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.030074 4865 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.030087 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.049209 4865 scope.go:117] "RemoveContainer" containerID="d1790cc075eaf351478d388dbb84bec06d1ed3ab500165ba61b4978fc6ebaf9f" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.069958 4865 scope.go:117] "RemoveContainer" containerID="c65419b3db36eed33b8240e2d190a8e80c2e824b7738cb8308d6481e027e6c87" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.084932 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-config-data" (OuterVolumeSpecName: "config-data") pod "f7834dad-27c7-4da7-9e87-1d5196269fe4" (UID: "f7834dad-27c7-4da7-9e87-1d5196269fe4"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.107313 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vfgpb"] Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.122133 4865 scope.go:117] "RemoveContainer" containerID="2b6f757c83e0c97cad4df3d2c88efced23e4f88e29586484c9852217cf4cd6e5" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.132325 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " pod="openstack/nova-api-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.132400 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " pod="openstack/nova-api-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.132459 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-config-data\") pod \"nova-api-0\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " pod="openstack/nova-api-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.132537 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-public-tls-certs\") pod \"nova-api-0\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " pod="openstack/nova-api-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.132591 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8907607c-d42a-49ac-9df0-9da2ceb015eb-logs\") pod \"nova-api-0\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " pod="openstack/nova-api-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.132615 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj9sm\" (UniqueName: \"kubernetes.io/projected/8907607c-d42a-49ac-9df0-9da2ceb015eb-kube-api-access-jj9sm\") pod \"nova-api-0\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " pod="openstack/nova-api-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.132722 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7834dad-27c7-4da7-9e87-1d5196269fe4-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.134066 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8907607c-d42a-49ac-9df0-9da2ceb015eb-logs\") pod \"nova-api-0\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " pod="openstack/nova-api-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.143896 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-public-tls-certs\") pod \"nova-api-0\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " pod="openstack/nova-api-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.148189 4865 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " pod="openstack/nova-api-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.148323 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " pod="openstack/nova-api-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.149153 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-config-data\") pod \"nova-api-0\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " pod="openstack/nova-api-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.157519 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj9sm\" (UniqueName: \"kubernetes.io/projected/8907607c-d42a-49ac-9df0-9da2ceb015eb-kube-api-access-jj9sm\") pod \"nova-api-0\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " pod="openstack/nova-api-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.159650 4865 scope.go:117] "RemoveContainer" containerID="26dcf7afe8f1130270983c82cec445391c3d56b2814ce108531cf02e93897d0a" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.205600 4865 scope.go:117] "RemoveContainer" containerID="d1790cc075eaf351478d388dbb84bec06d1ed3ab500165ba61b4978fc6ebaf9f" Dec 05 06:15:04 crc kubenswrapper[4865]: E1205 06:15:04.207140 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1790cc075eaf351478d388dbb84bec06d1ed3ab500165ba61b4978fc6ebaf9f\": container with ID starting with d1790cc075eaf351478d388dbb84bec06d1ed3ab500165ba61b4978fc6ebaf9f not found: ID does not exist" containerID="d1790cc075eaf351478d388dbb84bec06d1ed3ab500165ba61b4978fc6ebaf9f" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.207189 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1790cc075eaf351478d388dbb84bec06d1ed3ab500165ba61b4978fc6ebaf9f"} err="failed to get container status \"d1790cc075eaf351478d388dbb84bec06d1ed3ab500165ba61b4978fc6ebaf9f\": rpc error: code = NotFound desc = could not find container \"d1790cc075eaf351478d388dbb84bec06d1ed3ab500165ba61b4978fc6ebaf9f\": container with ID starting with d1790cc075eaf351478d388dbb84bec06d1ed3ab500165ba61b4978fc6ebaf9f not found: ID does not exist" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.207217 4865 scope.go:117] "RemoveContainer" containerID="c65419b3db36eed33b8240e2d190a8e80c2e824b7738cb8308d6481e027e6c87" Dec 05 06:15:04 crc kubenswrapper[4865]: E1205 06:15:04.207448 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c65419b3db36eed33b8240e2d190a8e80c2e824b7738cb8308d6481e027e6c87\": container with ID starting with c65419b3db36eed33b8240e2d190a8e80c2e824b7738cb8308d6481e027e6c87 not found: ID does not exist" containerID="c65419b3db36eed33b8240e2d190a8e80c2e824b7738cb8308d6481e027e6c87" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.207474 4865 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c65419b3db36eed33b8240e2d190a8e80c2e824b7738cb8308d6481e027e6c87"} err="failed to get container status \"c65419b3db36eed33b8240e2d190a8e80c2e824b7738cb8308d6481e027e6c87\": rpc error: code = NotFound desc = could not find container \"c65419b3db36eed33b8240e2d190a8e80c2e824b7738cb8308d6481e027e6c87\": container with ID starting with c65419b3db36eed33b8240e2d190a8e80c2e824b7738cb8308d6481e027e6c87 not found: ID does not exist" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.207490 4865 scope.go:117] "RemoveContainer" containerID="2b6f757c83e0c97cad4df3d2c88efced23e4f88e29586484c9852217cf4cd6e5" Dec 05 06:15:04 crc kubenswrapper[4865]: E1205 06:15:04.207723 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b6f757c83e0c97cad4df3d2c88efced23e4f88e29586484c9852217cf4cd6e5\": container with ID starting with 2b6f757c83e0c97cad4df3d2c88efced23e4f88e29586484c9852217cf4cd6e5 not found: ID does not exist" containerID="2b6f757c83e0c97cad4df3d2c88efced23e4f88e29586484c9852217cf4cd6e5" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.207748 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b6f757c83e0c97cad4df3d2c88efced23e4f88e29586484c9852217cf4cd6e5"} err="failed to get container status \"2b6f757c83e0c97cad4df3d2c88efced23e4f88e29586484c9852217cf4cd6e5\": rpc error: code = NotFound desc = could not find container \"2b6f757c83e0c97cad4df3d2c88efced23e4f88e29586484c9852217cf4cd6e5\": container with ID starting with 2b6f757c83e0c97cad4df3d2c88efced23e4f88e29586484c9852217cf4cd6e5 not found: ID does not exist" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.207765 4865 scope.go:117] "RemoveContainer" containerID="26dcf7afe8f1130270983c82cec445391c3d56b2814ce108531cf02e93897d0a" Dec 05 06:15:04 crc kubenswrapper[4865]: E1205 06:15:04.208102 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26dcf7afe8f1130270983c82cec445391c3d56b2814ce108531cf02e93897d0a\": container with ID starting with 26dcf7afe8f1130270983c82cec445391c3d56b2814ce108531cf02e93897d0a not found: ID does not exist" containerID="26dcf7afe8f1130270983c82cec445391c3d56b2814ce108531cf02e93897d0a" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.208127 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26dcf7afe8f1130270983c82cec445391c3d56b2814ce108531cf02e93897d0a"} err="failed to get container status \"26dcf7afe8f1130270983c82cec445391c3d56b2814ce108531cf02e93897d0a\": rpc error: code = NotFound desc = could not find container \"26dcf7afe8f1130270983c82cec445391c3d56b2814ce108531cf02e93897d0a\": container with ID starting with 26dcf7afe8f1130270983c82cec445391c3d56b2814ce108531cf02e93897d0a not found: ID does not exist" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.310533 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.336383 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.351521 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.360481 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.362927 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.366367 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.366630 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.441560 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.441618 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-scripts\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.441658 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.441763 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wcsp\" (UniqueName: \"kubernetes.io/projected/c25abf96-27e6-4918-ac97-949d973cc542-kube-api-access-8wcsp\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.441800 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c25abf96-27e6-4918-ac97-949d973cc542-log-httpd\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.441863 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c25abf96-27e6-4918-ac97-949d973cc542-run-httpd\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.441996 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-config-data\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.487865 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.548557 4865 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8wcsp\" (UniqueName: \"kubernetes.io/projected/c25abf96-27e6-4918-ac97-949d973cc542-kube-api-access-8wcsp\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.551883 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c25abf96-27e6-4918-ac97-949d973cc542-log-httpd\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.551992 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c25abf96-27e6-4918-ac97-949d973cc542-run-httpd\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.552138 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-config-data\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.552202 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.552235 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-scripts\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.552271 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.558812 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c25abf96-27e6-4918-ac97-949d973cc542-log-httpd\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.560651 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c25abf96-27e6-4918-ac97-949d973cc542-run-httpd\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.565365 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-config-data\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.570024 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.571528 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-scripts\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.574548 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.599698 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wcsp\" (UniqueName: \"kubernetes.io/projected/c25abf96-27e6-4918-ac97-949d973cc542-kube-api-access-8wcsp\") pod \"ceilometer-0\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.619549 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.619775 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1b43a6d5-6af1-413b-bfec-2607a76cc294" containerName="kube-state-metrics" containerID="cri-o://e0786d153574d957a17e2fcdce796e99b55e3d061c8fa99019a1541f6b65c006" gracePeriod=30 Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.763741 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.982318 4865 generic.go:334] "Generic (PLEG): container finished" podID="1b43a6d5-6af1-413b-bfec-2607a76cc294" containerID="e0786d153574d957a17e2fcdce796e99b55e3d061c8fa99019a1541f6b65c006" exitCode=2 Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.982413 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1b43a6d5-6af1-413b-bfec-2607a76cc294","Type":"ContainerDied","Data":"e0786d153574d957a17e2fcdce796e99b55e3d061c8fa99019a1541f6b65c006"} Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.999119 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vfgpb" event={"ID":"573cd206-8f29-473e-8394-e862c8ef17e5","Type":"ContainerStarted","Data":"34bfcd69dd61aeb075770fc5b52a4b2474415129d3302513db78c50edaffde24"} Dec 05 06:15:04 crc kubenswrapper[4865]: I1205 06:15:04.999171 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vfgpb" event={"ID":"573cd206-8f29-473e-8394-e862c8ef17e5","Type":"ContainerStarted","Data":"2ace43df90b6160600c22cdbd8b92bc272d375bddcbff68fa0dfc9e8dc625d7a"} Dec 05 06:15:05 crc kubenswrapper[4865]: I1205 06:15:05.032904 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-vfgpb" podStartSLOduration=2.032882249 podStartE2EDuration="2.032882249s" podCreationTimestamp="2025-12-05 06:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:15:05.019191123 +0000 UTC m=+1324.299202345" watchObservedRunningTime="2025-12-05 06:15:05.032882249 +0000 UTC m=+1324.312893471" Dec 05 06:15:05 crc kubenswrapper[4865]: I1205 06:15:05.037174 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfe6c68c-e7ca-4000-92e1-47b23aaf3045" path="/var/lib/kubelet/pods/cfe6c68c-e7ca-4000-92e1-47b23aaf3045/volumes" Dec 05 06:15:05 crc kubenswrapper[4865]: I1205 06:15:05.037841 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7834dad-27c7-4da7-9e87-1d5196269fe4" path="/var/lib/kubelet/pods/f7834dad-27c7-4da7-9e87-1d5196269fe4/volumes" Dec 05 06:15:05 crc kubenswrapper[4865]: I1205 06:15:05.038536 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 06:15:05 crc kubenswrapper[4865]: W1205 06:15:05.042456 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8907607c_d42a_49ac_9df0_9da2ceb015eb.slice/crio-d2011d0cfe17a420d521709597ba7658d09d96736e06e9a230e5b6c0f00f1398 WatchSource:0}: Error finding container d2011d0cfe17a420d521709597ba7658d09d96736e06e9a230e5b6c0f00f1398: Status 404 returned error can't find the container with id d2011d0cfe17a420d521709597ba7658d09d96736e06e9a230e5b6c0f00f1398 Dec 05 06:15:05 crc kubenswrapper[4865]: I1205 06:15:05.158195 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 06:15:05 crc kubenswrapper[4865]: I1205 06:15:05.270793 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgdbq\" (UniqueName: \"kubernetes.io/projected/1b43a6d5-6af1-413b-bfec-2607a76cc294-kube-api-access-jgdbq\") pod \"1b43a6d5-6af1-413b-bfec-2607a76cc294\" (UID: \"1b43a6d5-6af1-413b-bfec-2607a76cc294\") " Dec 05 06:15:05 crc kubenswrapper[4865]: I1205 06:15:05.282653 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b43a6d5-6af1-413b-bfec-2607a76cc294-kube-api-access-jgdbq" (OuterVolumeSpecName: "kube-api-access-jgdbq") pod "1b43a6d5-6af1-413b-bfec-2607a76cc294" (UID: "1b43a6d5-6af1-413b-bfec-2607a76cc294"). InnerVolumeSpecName "kube-api-access-jgdbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:15:05 crc kubenswrapper[4865]: I1205 06:15:05.372940 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgdbq\" (UniqueName: \"kubernetes.io/projected/1b43a6d5-6af1-413b-bfec-2607a76cc294-kube-api-access-jgdbq\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:05 crc kubenswrapper[4865]: I1205 06:15:05.388974 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.010386 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.010394 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1b43a6d5-6af1-413b-bfec-2607a76cc294","Type":"ContainerDied","Data":"08f986d3d2f8c3b4cc69f7db04e2449ed44b4b1a0c0b93ce522cd40e1d1dc098"} Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.011108 4865 scope.go:117] "RemoveContainer" containerID="e0786d153574d957a17e2fcdce796e99b55e3d061c8fa99019a1541f6b65c006" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.014409 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8907607c-d42a-49ac-9df0-9da2ceb015eb","Type":"ContainerStarted","Data":"e537772632cb4790421fca55c1d4c51d9a4a6a36fc9af49da4f0e3ed4d7ed409"} Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.014450 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8907607c-d42a-49ac-9df0-9da2ceb015eb","Type":"ContainerStarted","Data":"64aa5cc6ad8fefad00a3910e973949f379f54bb823ca86c86598cf86018c52dd"} Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.014461 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8907607c-d42a-49ac-9df0-9da2ceb015eb","Type":"ContainerStarted","Data":"d2011d0cfe17a420d521709597ba7658d09d96736e06e9a230e5b6c0f00f1398"} Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.017382 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c25abf96-27e6-4918-ac97-949d973cc542","Type":"ContainerStarted","Data":"a2474c4315faca12a060f6ab8dac8141db329fd18a9f47d202c784576bfbf578"} Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.021607 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg748" event={"ID":"b2bc4e34-9781-48f0-84ac-3a7d3e583311","Type":"ContainerStarted","Data":"98a77a37d33dc31198597d3dc86dda49255b55628bc835e6f9ff7981b01dc9f3"} Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 
06:15:06.053695 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.053675335 podStartE2EDuration="3.053675335s" podCreationTimestamp="2025-12-05 06:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:15:06.04295912 +0000 UTC m=+1325.322970342" watchObservedRunningTime="2025-12-05 06:15:06.053675335 +0000 UTC m=+1325.333686557" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.105817 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.122802 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.136285 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 06:15:06 crc kubenswrapper[4865]: E1205 06:15:06.136809 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b43a6d5-6af1-413b-bfec-2607a76cc294" containerName="kube-state-metrics" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.136843 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b43a6d5-6af1-413b-bfec-2607a76cc294" containerName="kube-state-metrics" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.137065 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b43a6d5-6af1-413b-bfec-2607a76cc294" containerName="kube-state-metrics" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.137844 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.145250 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.148492 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.148771 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.238095 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.288714 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df463e57-e3b9-4829-bd44-94c3ec6a90fa-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"df463e57-e3b9-4829-bd44-94c3ec6a90fa\") " pod="openstack/kube-state-metrics-0" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.288781 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d42bf\" (UniqueName: \"kubernetes.io/projected/df463e57-e3b9-4829-bd44-94c3ec6a90fa-kube-api-access-d42bf\") pod \"kube-state-metrics-0\" (UID: \"df463e57-e3b9-4829-bd44-94c3ec6a90fa\") " pod="openstack/kube-state-metrics-0" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.288812 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/df463e57-e3b9-4829-bd44-94c3ec6a90fa-kube-state-metrics-tls-config\") pod 
\"kube-state-metrics-0\" (UID: \"df463e57-e3b9-4829-bd44-94c3ec6a90fa\") " pod="openstack/kube-state-metrics-0" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.288845 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/df463e57-e3b9-4829-bd44-94c3ec6a90fa-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"df463e57-e3b9-4829-bd44-94c3ec6a90fa\") " pod="openstack/kube-state-metrics-0" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.313573 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-8pvjj"] Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.313879 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" podUID="929b303b-d676-4548-9186-c29a7921cb8d" containerName="dnsmasq-dns" containerID="cri-o://cc4f42d998e4e3ff72e66cf93d52231d1d6927ca57713c318baaaebdbed7af19" gracePeriod=10 Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.393117 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df463e57-e3b9-4829-bd44-94c3ec6a90fa-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"df463e57-e3b9-4829-bd44-94c3ec6a90fa\") " pod="openstack/kube-state-metrics-0" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.394190 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d42bf\" (UniqueName: \"kubernetes.io/projected/df463e57-e3b9-4829-bd44-94c3ec6a90fa-kube-api-access-d42bf\") pod \"kube-state-metrics-0\" (UID: \"df463e57-e3b9-4829-bd44-94c3ec6a90fa\") " pod="openstack/kube-state-metrics-0" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.394242 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/df463e57-e3b9-4829-bd44-94c3ec6a90fa-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"df463e57-e3b9-4829-bd44-94c3ec6a90fa\") " pod="openstack/kube-state-metrics-0" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.394270 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/df463e57-e3b9-4829-bd44-94c3ec6a90fa-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"df463e57-e3b9-4829-bd44-94c3ec6a90fa\") " pod="openstack/kube-state-metrics-0" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.409231 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df463e57-e3b9-4829-bd44-94c3ec6a90fa-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"df463e57-e3b9-4829-bd44-94c3ec6a90fa\") " pod="openstack/kube-state-metrics-0" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.411688 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/df463e57-e3b9-4829-bd44-94c3ec6a90fa-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"df463e57-e3b9-4829-bd44-94c3ec6a90fa\") " pod="openstack/kube-state-metrics-0" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.425495 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/df463e57-e3b9-4829-bd44-94c3ec6a90fa-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"df463e57-e3b9-4829-bd44-94c3ec6a90fa\") " pod="openstack/kube-state-metrics-0" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.428370 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d42bf\" (UniqueName: \"kubernetes.io/projected/df463e57-e3b9-4829-bd44-94c3ec6a90fa-kube-api-access-d42bf\") pod \"kube-state-metrics-0\" (UID: \"df463e57-e3b9-4829-bd44-94c3ec6a90fa\") " pod="openstack/kube-state-metrics-0" Dec 05 06:15:06 crc kubenswrapper[4865]: I1205 06:15:06.470860 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 06:15:07 crc kubenswrapper[4865]: I1205 06:15:07.016401 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b43a6d5-6af1-413b-bfec-2607a76cc294" path="/var/lib/kubelet/pods/1b43a6d5-6af1-413b-bfec-2607a76cc294/volumes" Dec 05 06:15:07 crc kubenswrapper[4865]: I1205 06:15:07.031266 4865 generic.go:334] "Generic (PLEG): container finished" podID="929b303b-d676-4548-9186-c29a7921cb8d" containerID="cc4f42d998e4e3ff72e66cf93d52231d1d6927ca57713c318baaaebdbed7af19" exitCode=0 Dec 05 06:15:07 crc kubenswrapper[4865]: I1205 06:15:07.031313 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" event={"ID":"929b303b-d676-4548-9186-c29a7921cb8d","Type":"ContainerDied","Data":"cc4f42d998e4e3ff72e66cf93d52231d1d6927ca57713c318baaaebdbed7af19"} Dec 05 06:15:07 crc kubenswrapper[4865]: I1205 06:15:07.583447 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 06:15:07 crc kubenswrapper[4865]: I1205 06:15:07.902709 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.034879 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-dns-svc\") pod \"929b303b-d676-4548-9186-c29a7921cb8d\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.034950 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-dns-swift-storage-0\") pod \"929b303b-d676-4548-9186-c29a7921cb8d\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.034995 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-config\") pod \"929b303b-d676-4548-9186-c29a7921cb8d\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.035079 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-ovsdbserver-sb\") pod \"929b303b-d676-4548-9186-c29a7921cb8d\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.035163 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46fz2\" (UniqueName: \"kubernetes.io/projected/929b303b-d676-4548-9186-c29a7921cb8d-kube-api-access-46fz2\") pod \"929b303b-d676-4548-9186-c29a7921cb8d\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.035296 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-ovsdbserver-nb\") pod \"929b303b-d676-4548-9186-c29a7921cb8d\" (UID: \"929b303b-d676-4548-9186-c29a7921cb8d\") " Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.129297 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/929b303b-d676-4548-9186-c29a7921cb8d-kube-api-access-46fz2" (OuterVolumeSpecName: "kube-api-access-46fz2") pod "929b303b-d676-4548-9186-c29a7921cb8d" (UID: "929b303b-d676-4548-9186-c29a7921cb8d"). InnerVolumeSpecName "kube-api-access-46fz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.140559 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" event={"ID":"929b303b-d676-4548-9186-c29a7921cb8d","Type":"ContainerDied","Data":"d393df4122c18ee5899ed37f019fe32ae7959b9a62bcf51dbeadc32639c5e1a2"} Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.140628 4865 scope.go:117] "RemoveContainer" containerID="cc4f42d998e4e3ff72e66cf93d52231d1d6927ca57713c318baaaebdbed7af19" Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.140792 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-8pvjj" Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.143399 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46fz2\" (UniqueName: \"kubernetes.io/projected/929b303b-d676-4548-9186-c29a7921cb8d-kube-api-access-46fz2\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.189889 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"df463e57-e3b9-4829-bd44-94c3ec6a90fa","Type":"ContainerStarted","Data":"36a8f67dcedac835c31a87fe1a54a678a2ef0ebdddb1b8b19522d40d56b7103c"} Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.191752 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c25abf96-27e6-4918-ac97-949d973cc542","Type":"ContainerStarted","Data":"8abad379185f9c1e2a91321dbcd07e7a65581e638b884992b75c260c2a6c3be1"} Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.265162 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "929b303b-d676-4548-9186-c29a7921cb8d" (UID: "929b303b-d676-4548-9186-c29a7921cb8d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.273560 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "929b303b-d676-4548-9186-c29a7921cb8d" (UID: "929b303b-d676-4548-9186-c29a7921cb8d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.305032 4865 scope.go:117] "RemoveContainer" containerID="5fd9046a7115f6ced3a8f3cbf4acd158d79abdc939f7080b26a5239176a3489b" Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.312436 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "929b303b-d676-4548-9186-c29a7921cb8d" (UID: "929b303b-d676-4548-9186-c29a7921cb8d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.334745 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-config" (OuterVolumeSpecName: "config") pod "929b303b-d676-4548-9186-c29a7921cb8d" (UID: "929b303b-d676-4548-9186-c29a7921cb8d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.355203 4865 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.355235 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.355244 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.355252 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.370550 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "929b303b-d676-4548-9186-c29a7921cb8d" (UID: "929b303b-d676-4548-9186-c29a7921cb8d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.393389 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.456847 4865 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/929b303b-d676-4548-9186-c29a7921cb8d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.475752 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-8pvjj"] Dec 05 06:15:08 crc kubenswrapper[4865]: I1205 06:15:08.491090 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-8pvjj"] Dec 05 06:15:09 crc kubenswrapper[4865]: I1205 06:15:09.018429 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="929b303b-d676-4548-9186-c29a7921cb8d" path="/var/lib/kubelet/pods/929b303b-d676-4548-9186-c29a7921cb8d/volumes" Dec 05 06:15:09 crc kubenswrapper[4865]: I1205 06:15:09.202064 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"df463e57-e3b9-4829-bd44-94c3ec6a90fa","Type":"ContainerStarted","Data":"b4051ebff31e9ad5bead0116f36fc7ed4b68faebeeca7991f10fc73ed43fbc05"} Dec 05 06:15:09 crc kubenswrapper[4865]: I1205 06:15:09.204542 4865 generic.go:334] "Generic (PLEG): container finished" podID="b2bc4e34-9781-48f0-84ac-3a7d3e583311" containerID="98a77a37d33dc31198597d3dc86dda49255b55628bc835e6f9ff7981b01dc9f3" exitCode=0 Dec 05 06:15:09 crc kubenswrapper[4865]: I1205 06:15:09.204611 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg748" event={"ID":"b2bc4e34-9781-48f0-84ac-3a7d3e583311","Type":"ContainerDied","Data":"98a77a37d33dc31198597d3dc86dda49255b55628bc835e6f9ff7981b01dc9f3"} Dec 05 06:15:09 crc kubenswrapper[4865]: I1205 06:15:09.209209 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c25abf96-27e6-4918-ac97-949d973cc542","Type":"ContainerStarted","Data":"6ec36b8581d4250a63a5fcf8a2c7398a8913c045b64c12b12db1744a4cbc776b"} Dec 05 06:15:10 crc kubenswrapper[4865]: I1205 06:15:10.223551 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c25abf96-27e6-4918-ac97-949d973cc542","Type":"ContainerStarted","Data":"5abc0ceebbe39a8c1a04192bd44fc9eb3f7e855167250463cb468c6d6b069e96"} Dec 05 06:15:10 crc kubenswrapper[4865]: I1205 06:15:10.224000 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 06:15:10 crc kubenswrapper[4865]: I1205 06:15:10.246022 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.384004248 podStartE2EDuration="4.24598388s" podCreationTimestamp="2025-12-05 06:15:06 +0000 UTC" firstStartedPulling="2025-12-05 06:15:07.662044908 +0000 UTC m=+1326.942056130" lastFinishedPulling="2025-12-05 06:15:08.52402454 +0000 UTC m=+1327.804035762" observedRunningTime="2025-12-05 06:15:10.239026229 +0000 UTC m=+1329.519037451" watchObservedRunningTime="2025-12-05 06:15:10.24598388 +0000 UTC m=+1329.525995102" Dec 05 06:15:11 crc kubenswrapper[4865]: I1205 06:15:11.054980 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:15:11 crc kubenswrapper[4865]: I1205 06:15:11.055366 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:15:11 crc kubenswrapper[4865]: I1205 06:15:11.236609 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg748" event={"ID":"b2bc4e34-9781-48f0-84ac-3a7d3e583311","Type":"ContainerStarted","Data":"8dae8cda329534a9e83b2b8284879e5c08ffce22890f9016b26706788000949c"} Dec 05 06:15:11 crc kubenswrapper[4865]: I1205 06:15:11.263009 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fg748" podStartSLOduration=4.087061252 podStartE2EDuration="10.262986341s" podCreationTimestamp="2025-12-05 06:15:01 +0000 UTC" firstStartedPulling="2025-12-05 06:15:03.805138107 +0000 UTC m=+1323.085149329" lastFinishedPulling="2025-12-05 06:15:09.981063196 +0000 UTC m=+1329.261074418" observedRunningTime="2025-12-05 06:15:11.262902889 +0000 UTC m=+1330.542914121" watchObservedRunningTime="2025-12-05 06:15:11.262986341 +0000 UTC m=+1330.542997563" Dec 05 06:15:12 crc kubenswrapper[4865]: I1205 06:15:12.241790 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fg748" Dec 05 06:15:12 crc kubenswrapper[4865]: I1205 06:15:12.243144 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fg748" Dec 05 06:15:12 crc kubenswrapper[4865]: I1205 06:15:12.266527 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c25abf96-27e6-4918-ac97-949d973cc542","Type":"ContainerStarted","Data":"89fca49ca21b8feac967a5eeffa7bb4c719b9c5501b5233893286f0ae5f9189f"} Dec 05 06:15:12 crc kubenswrapper[4865]: I1205 06:15:12.267173 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c25abf96-27e6-4918-ac97-949d973cc542" containerName="sg-core" containerID="cri-o://5abc0ceebbe39a8c1a04192bd44fc9eb3f7e855167250463cb468c6d6b069e96" gracePeriod=30 Dec 05 06:15:12 crc kubenswrapper[4865]: I1205 06:15:12.267400 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c25abf96-27e6-4918-ac97-949d973cc542" containerName="proxy-httpd" containerID="cri-o://89fca49ca21b8feac967a5eeffa7bb4c719b9c5501b5233893286f0ae5f9189f" gracePeriod=30 Dec 05 06:15:12 crc kubenswrapper[4865]: I1205 06:15:12.267580 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c25abf96-27e6-4918-ac97-949d973cc542" containerName="ceilometer-notification-agent" containerID="cri-o://6ec36b8581d4250a63a5fcf8a2c7398a8913c045b64c12b12db1744a4cbc776b" gracePeriod=30 Dec 05 06:15:12 crc kubenswrapper[4865]: I1205 06:15:12.267804 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c25abf96-27e6-4918-ac97-949d973cc542" containerName="ceilometer-central-agent" containerID="cri-o://8abad379185f9c1e2a91321dbcd07e7a65581e638b884992b75c260c2a6c3be1" gracePeriod=30 Dec 05 06:15:12 crc kubenswrapper[4865]: I1205 06:15:12.299102 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.337618981 podStartE2EDuration="8.299081787s" podCreationTimestamp="2025-12-05 06:15:04 +0000 UTC" firstStartedPulling="2025-12-05 06:15:05.392693686 +0000 UTC m=+1324.672704908" lastFinishedPulling="2025-12-05 06:15:11.354156492 +0000 UTC m=+1330.634167714" observedRunningTime="2025-12-05 06:15:12.290198423 +0000 UTC m=+1331.570209645" watchObservedRunningTime="2025-12-05 06:15:12.299081787 +0000 UTC m=+1331.579093009" Dec 05 06:15:13 crc kubenswrapper[4865]: I1205 06:15:13.278458 4865 generic.go:334] "Generic (PLEG): container finished" podID="c25abf96-27e6-4918-ac97-949d973cc542" containerID="5abc0ceebbe39a8c1a04192bd44fc9eb3f7e855167250463cb468c6d6b069e96" exitCode=2 Dec 05 06:15:13 crc kubenswrapper[4865]: I1205 06:15:13.279460 4865 generic.go:334] "Generic (PLEG): container finished" podID="c25abf96-27e6-4918-ac97-949d973cc542" containerID="6ec36b8581d4250a63a5fcf8a2c7398a8913c045b64c12b12db1744a4cbc776b" exitCode=0 Dec 05 06:15:13 crc kubenswrapper[4865]: I1205 06:15:13.278503 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c25abf96-27e6-4918-ac97-949d973cc542","Type":"ContainerDied","Data":"5abc0ceebbe39a8c1a04192bd44fc9eb3f7e855167250463cb468c6d6b069e96"} Dec 05 06:15:13 crc kubenswrapper[4865]: I1205 06:15:13.279617 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c25abf96-27e6-4918-ac97-949d973cc542","Type":"ContainerDied","Data":"6ec36b8581d4250a63a5fcf8a2c7398a8913c045b64c12b12db1744a4cbc776b"} Dec 05 06:15:13 crc kubenswrapper[4865]: I1205 06:15:13.329814 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fg748" podUID="b2bc4e34-9781-48f0-84ac-3a7d3e583311" containerName="registry-server" probeResult="failure" output=< Dec 05 
06:15:13 crc kubenswrapper[4865]: timeout: failed to connect service ":50051" within 1s Dec 05 06:15:13 crc kubenswrapper[4865]: > Dec 05 06:15:14 crc kubenswrapper[4865]: I1205 06:15:14.292610 4865 generic.go:334] "Generic (PLEG): container finished" podID="573cd206-8f29-473e-8394-e862c8ef17e5" containerID="34bfcd69dd61aeb075770fc5b52a4b2474415129d3302513db78c50edaffde24" exitCode=0 Dec 05 06:15:14 crc kubenswrapper[4865]: I1205 06:15:14.292703 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vfgpb" event={"ID":"573cd206-8f29-473e-8394-e862c8ef17e5","Type":"ContainerDied","Data":"34bfcd69dd61aeb075770fc5b52a4b2474415129d3302513db78c50edaffde24"} Dec 05 06:15:14 crc kubenswrapper[4865]: I1205 06:15:14.360949 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 06:15:14 crc kubenswrapper[4865]: I1205 06:15:14.360997 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 06:15:15 crc kubenswrapper[4865]: I1205 06:15:15.365032 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8907607c-d42a-49ac-9df0-9da2ceb015eb" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 06:15:15 crc kubenswrapper[4865]: I1205 06:15:15.369975 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8907607c-d42a-49ac-9df0-9da2ceb015eb" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 06:15:15 crc kubenswrapper[4865]: I1205 06:15:15.780956 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vfgpb" Dec 05 06:15:15 crc kubenswrapper[4865]: I1205 06:15:15.828058 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/573cd206-8f29-473e-8394-e862c8ef17e5-scripts\") pod \"573cd206-8f29-473e-8394-e862c8ef17e5\" (UID: \"573cd206-8f29-473e-8394-e862c8ef17e5\") " Dec 05 06:15:15 crc kubenswrapper[4865]: I1205 06:15:15.828221 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq5mc\" (UniqueName: \"kubernetes.io/projected/573cd206-8f29-473e-8394-e862c8ef17e5-kube-api-access-dq5mc\") pod \"573cd206-8f29-473e-8394-e862c8ef17e5\" (UID: \"573cd206-8f29-473e-8394-e862c8ef17e5\") " Dec 05 06:15:15 crc kubenswrapper[4865]: I1205 06:15:15.828276 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/573cd206-8f29-473e-8394-e862c8ef17e5-config-data\") pod \"573cd206-8f29-473e-8394-e862c8ef17e5\" (UID: \"573cd206-8f29-473e-8394-e862c8ef17e5\") " Dec 05 06:15:15 crc kubenswrapper[4865]: I1205 06:15:15.828307 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/573cd206-8f29-473e-8394-e862c8ef17e5-combined-ca-bundle\") pod \"573cd206-8f29-473e-8394-e862c8ef17e5\" (UID: \"573cd206-8f29-473e-8394-e862c8ef17e5\") " Dec 05 06:15:15 crc kubenswrapper[4865]: I1205 06:15:15.843939 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/573cd206-8f29-473e-8394-e862c8ef17e5-scripts" (OuterVolumeSpecName: "scripts") pod "573cd206-8f29-473e-8394-e862c8ef17e5" (UID: "573cd206-8f29-473e-8394-e862c8ef17e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:15 crc kubenswrapper[4865]: I1205 06:15:15.844368 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/573cd206-8f29-473e-8394-e862c8ef17e5-kube-api-access-dq5mc" (OuterVolumeSpecName: "kube-api-access-dq5mc") pod "573cd206-8f29-473e-8394-e862c8ef17e5" (UID: "573cd206-8f29-473e-8394-e862c8ef17e5"). InnerVolumeSpecName "kube-api-access-dq5mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:15:15 crc kubenswrapper[4865]: I1205 06:15:15.881016 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/573cd206-8f29-473e-8394-e862c8ef17e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "573cd206-8f29-473e-8394-e862c8ef17e5" (UID: "573cd206-8f29-473e-8394-e862c8ef17e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:15 crc kubenswrapper[4865]: I1205 06:15:15.919234 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/573cd206-8f29-473e-8394-e862c8ef17e5-config-data" (OuterVolumeSpecName: "config-data") pod "573cd206-8f29-473e-8394-e862c8ef17e5" (UID: "573cd206-8f29-473e-8394-e862c8ef17e5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:15 crc kubenswrapper[4865]: I1205 06:15:15.930649 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq5mc\" (UniqueName: \"kubernetes.io/projected/573cd206-8f29-473e-8394-e862c8ef17e5-kube-api-access-dq5mc\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:15 crc kubenswrapper[4865]: I1205 06:15:15.930681 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/573cd206-8f29-473e-8394-e862c8ef17e5-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:15 crc kubenswrapper[4865]: I1205 06:15:15.930710 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/573cd206-8f29-473e-8394-e862c8ef17e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:15 crc kubenswrapper[4865]: I1205 06:15:15.930721 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/573cd206-8f29-473e-8394-e862c8ef17e5-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:16 crc kubenswrapper[4865]: I1205 06:15:16.321912 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vfgpb" event={"ID":"573cd206-8f29-473e-8394-e862c8ef17e5","Type":"ContainerDied","Data":"2ace43df90b6160600c22cdbd8b92bc272d375bddcbff68fa0dfc9e8dc625d7a"} Dec 05 06:15:16 crc kubenswrapper[4865]: I1205 06:15:16.322183 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ace43df90b6160600c22cdbd8b92bc272d375bddcbff68fa0dfc9e8dc625d7a" Dec 05 06:15:16 crc kubenswrapper[4865]: I1205 06:15:16.322038 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vfgpb" Dec 05 06:15:16 crc kubenswrapper[4865]: I1205 06:15:16.489035 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 06:15:16 crc kubenswrapper[4865]: I1205 06:15:16.516799 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 06:15:16 crc kubenswrapper[4865]: I1205 06:15:16.517093 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8907607c-d42a-49ac-9df0-9da2ceb015eb" containerName="nova-api-log" containerID="cri-o://64aa5cc6ad8fefad00a3910e973949f379f54bb823ca86c86598cf86018c52dd" gracePeriod=30 Dec 05 06:15:16 crc kubenswrapper[4865]: I1205 06:15:16.517265 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8907607c-d42a-49ac-9df0-9da2ceb015eb" containerName="nova-api-api" containerID="cri-o://e537772632cb4790421fca55c1d4c51d9a4a6a36fc9af49da4f0e3ed4d7ed409" gracePeriod=30 Dec 05 06:15:16 crc kubenswrapper[4865]: I1205 06:15:16.530713 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 06:15:16 crc kubenswrapper[4865]: I1205 06:15:16.530995 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="87766c6c-42b5-4851-9729-29f38ed36ae5" containerName="nova-scheduler-scheduler" containerID="cri-o://e144142e3bd1ff497b55b8908f46a023c17bb8404c904495f61e63e260de21d6" gracePeriod=30 Dec 05 06:15:16 crc kubenswrapper[4865]: I1205 06:15:16.547726 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:15:16 crc kubenswrapper[4865]: I1205 
06:15:16.550174 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5025acca-f209-4f4c-ab4e-b7e386f5c3ab" containerName="nova-metadata-log" containerID="cri-o://7f1b9ce1734b8f703e12f26e1ba689eb26a81ec5bdc81bf08cc7045cf96572be" gracePeriod=30 Dec 05 06:15:16 crc kubenswrapper[4865]: I1205 06:15:16.550442 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5025acca-f209-4f4c-ab4e-b7e386f5c3ab" containerName="nova-metadata-metadata" containerID="cri-o://dda00ade2362b19bae318cabdf3adaeb9bbb1de17d6321f885296bf1ceea047c" gracePeriod=30 Dec 05 06:15:17 crc kubenswrapper[4865]: I1205 06:15:17.334037 4865 generic.go:334] "Generic (PLEG): container finished" podID="8907607c-d42a-49ac-9df0-9da2ceb015eb" containerID="64aa5cc6ad8fefad00a3910e973949f379f54bb823ca86c86598cf86018c52dd" exitCode=143 Dec 05 06:15:17 crc kubenswrapper[4865]: I1205 06:15:17.334127 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8907607c-d42a-49ac-9df0-9da2ceb015eb","Type":"ContainerDied","Data":"64aa5cc6ad8fefad00a3910e973949f379f54bb823ca86c86598cf86018c52dd"} Dec 05 06:15:17 crc kubenswrapper[4865]: I1205 06:15:17.337284 4865 generic.go:334] "Generic (PLEG): container finished" podID="5025acca-f209-4f4c-ab4e-b7e386f5c3ab" containerID="7f1b9ce1734b8f703e12f26e1ba689eb26a81ec5bdc81bf08cc7045cf96572be" exitCode=143 Dec 05 06:15:17 crc kubenswrapper[4865]: I1205 06:15:17.337336 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5025acca-f209-4f4c-ab4e-b7e386f5c3ab","Type":"ContainerDied","Data":"7f1b9ce1734b8f703e12f26e1ba689eb26a81ec5bdc81bf08cc7045cf96572be"} Dec 05 06:15:18 crc kubenswrapper[4865]: E1205 06:15:18.791336 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e144142e3bd1ff497b55b8908f46a023c17bb8404c904495f61e63e260de21d6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 06:15:18 crc kubenswrapper[4865]: E1205 06:15:18.792863 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e144142e3bd1ff497b55b8908f46a023c17bb8404c904495f61e63e260de21d6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 06:15:18 crc kubenswrapper[4865]: E1205 06:15:18.794251 4865 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e144142e3bd1ff497b55b8908f46a023c17bb8404c904495f61e63e260de21d6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 06:15:18 crc kubenswrapper[4865]: E1205 06:15:18.794282 4865 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="87766c6c-42b5-4851-9729-29f38ed36ae5" containerName="nova-scheduler-scheduler" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.049044 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="5025acca-f209-4f4c-ab4e-b7e386f5c3ab" 
containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": dial tcp 10.217.0.195:8775: connect: connection refused" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.049497 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="5025acca-f209-4f4c-ab4e-b7e386f5c3ab" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": dial tcp 10.217.0.195:8775: connect: connection refused" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.212533 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.226984 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87766c6c-42b5-4851-9729-29f38ed36ae5-combined-ca-bundle\") pod \"87766c6c-42b5-4851-9729-29f38ed36ae5\" (UID: \"87766c6c-42b5-4851-9729-29f38ed36ae5\") " Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.227089 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chhxd\" (UniqueName: \"kubernetes.io/projected/87766c6c-42b5-4851-9729-29f38ed36ae5-kube-api-access-chhxd\") pod \"87766c6c-42b5-4851-9729-29f38ed36ae5\" (UID: \"87766c6c-42b5-4851-9729-29f38ed36ae5\") " Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.227124 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87766c6c-42b5-4851-9729-29f38ed36ae5-config-data\") pod \"87766c6c-42b5-4851-9729-29f38ed36ae5\" (UID: \"87766c6c-42b5-4851-9729-29f38ed36ae5\") " Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.259202 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87766c6c-42b5-4851-9729-29f38ed36ae5-kube-api-access-chhxd" (OuterVolumeSpecName: "kube-api-access-chhxd") pod "87766c6c-42b5-4851-9729-29f38ed36ae5" (UID: "87766c6c-42b5-4851-9729-29f38ed36ae5"). InnerVolumeSpecName "kube-api-access-chhxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.295201 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87766c6c-42b5-4851-9729-29f38ed36ae5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87766c6c-42b5-4851-9729-29f38ed36ae5" (UID: "87766c6c-42b5-4851-9729-29f38ed36ae5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.298614 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87766c6c-42b5-4851-9729-29f38ed36ae5-config-data" (OuterVolumeSpecName: "config-data") pod "87766c6c-42b5-4851-9729-29f38ed36ae5" (UID: "87766c6c-42b5-4851-9729-29f38ed36ae5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.329933 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87766c6c-42b5-4851-9729-29f38ed36ae5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.329968 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chhxd\" (UniqueName: \"kubernetes.io/projected/87766c6c-42b5-4851-9729-29f38ed36ae5-kube-api-access-chhxd\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.329979 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87766c6c-42b5-4851-9729-29f38ed36ae5-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.369325 4865 generic.go:334] "Generic (PLEG): container finished" podID="c25abf96-27e6-4918-ac97-949d973cc542" containerID="8abad379185f9c1e2a91321dbcd07e7a65581e638b884992b75c260c2a6c3be1" exitCode=0 Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.369374 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c25abf96-27e6-4918-ac97-949d973cc542","Type":"ContainerDied","Data":"8abad379185f9c1e2a91321dbcd07e7a65581e638b884992b75c260c2a6c3be1"} Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.371942 4865 generic.go:334] "Generic (PLEG): container finished" podID="5025acca-f209-4f4c-ab4e-b7e386f5c3ab" containerID="dda00ade2362b19bae318cabdf3adaeb9bbb1de17d6321f885296bf1ceea047c" exitCode=0 Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.372009 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5025acca-f209-4f4c-ab4e-b7e386f5c3ab","Type":"ContainerDied","Data":"dda00ade2362b19bae318cabdf3adaeb9bbb1de17d6321f885296bf1ceea047c"} Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.376215 4865 generic.go:334] "Generic (PLEG): container finished" podID="87766c6c-42b5-4851-9729-29f38ed36ae5" containerID="e144142e3bd1ff497b55b8908f46a023c17bb8404c904495f61e63e260de21d6" exitCode=0 Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.376267 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"87766c6c-42b5-4851-9729-29f38ed36ae5","Type":"ContainerDied","Data":"e144142e3bd1ff497b55b8908f46a023c17bb8404c904495f61e63e260de21d6"} Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.376279 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.376300 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"87766c6c-42b5-4851-9729-29f38ed36ae5","Type":"ContainerDied","Data":"18f245630b70c3b5ac0b82c0456f032c5c7658e2dcd408d09fdf8733029dc0e6"} Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.376337 4865 scope.go:117] "RemoveContainer" containerID="e144142e3bd1ff497b55b8908f46a023c17bb8404c904495f61e63e260de21d6" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.449197 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.456416 4865 scope.go:117] "RemoveContainer" containerID="e144142e3bd1ff497b55b8908f46a023c17bb8404c904495f61e63e260de21d6" Dec 05 06:15:20 crc kubenswrapper[4865]: E1205 06:15:20.456934 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e144142e3bd1ff497b55b8908f46a023c17bb8404c904495f61e63e260de21d6\": container with ID starting with e144142e3bd1ff497b55b8908f46a023c17bb8404c904495f61e63e260de21d6 not found: ID does not exist" containerID="e144142e3bd1ff497b55b8908f46a023c17bb8404c904495f61e63e260de21d6" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.456971 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e144142e3bd1ff497b55b8908f46a023c17bb8404c904495f61e63e260de21d6"} err="failed to get container status \"e144142e3bd1ff497b55b8908f46a023c17bb8404c904495f61e63e260de21d6\": rpc error: code = NotFound desc = could not find container \"e144142e3bd1ff497b55b8908f46a023c17bb8404c904495f61e63e260de21d6\": container with ID starting with e144142e3bd1ff497b55b8908f46a023c17bb8404c904495f61e63e260de21d6 not found: ID does not exist" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.468489 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.483877 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 06:15:20 crc kubenswrapper[4865]: E1205 06:15:20.484270 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="929b303b-d676-4548-9186-c29a7921cb8d" containerName="init" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.484287 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="929b303b-d676-4548-9186-c29a7921cb8d" containerName="init" Dec 05 06:15:20 crc kubenswrapper[4865]: E1205 06:15:20.484302 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87766c6c-42b5-4851-9729-29f38ed36ae5" containerName="nova-scheduler-scheduler" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.484308 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="87766c6c-42b5-4851-9729-29f38ed36ae5" containerName="nova-scheduler-scheduler" Dec 05 06:15:20 crc kubenswrapper[4865]: E1205 06:15:20.484320 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="573cd206-8f29-473e-8394-e862c8ef17e5" containerName="nova-manage" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.484326 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="573cd206-8f29-473e-8394-e862c8ef17e5" containerName="nova-manage" Dec 05 06:15:20 crc kubenswrapper[4865]: E1205 06:15:20.484345 4865 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="929b303b-d676-4548-9186-c29a7921cb8d" containerName="dnsmasq-dns" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.484350 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="929b303b-d676-4548-9186-c29a7921cb8d" containerName="dnsmasq-dns" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.484538 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="573cd206-8f29-473e-8394-e862c8ef17e5" containerName="nova-manage" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.484552 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="929b303b-d676-4548-9186-c29a7921cb8d" containerName="dnsmasq-dns" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.484573 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="87766c6c-42b5-4851-9729-29f38ed36ae5" containerName="nova-scheduler-scheduler" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.485188 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.495336 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.521678 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.535113 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6fdeb31-0c08-4c87-82a4-5a51af86aa1f-config-data\") pod \"nova-scheduler-0\" (UID: \"f6fdeb31-0c08-4c87-82a4-5a51af86aa1f\") " pod="openstack/nova-scheduler-0" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.535258 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6fdeb31-0c08-4c87-82a4-5a51af86aa1f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f6fdeb31-0c08-4c87-82a4-5a51af86aa1f\") " pod="openstack/nova-scheduler-0" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.535300 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-759nn\" (UniqueName: \"kubernetes.io/projected/f6fdeb31-0c08-4c87-82a4-5a51af86aa1f-kube-api-access-759nn\") pod \"nova-scheduler-0\" (UID: \"f6fdeb31-0c08-4c87-82a4-5a51af86aa1f\") " pod="openstack/nova-scheduler-0" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.637431 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6fdeb31-0c08-4c87-82a4-5a51af86aa1f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f6fdeb31-0c08-4c87-82a4-5a51af86aa1f\") " pod="openstack/nova-scheduler-0" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.638332 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-759nn\" (UniqueName: \"kubernetes.io/projected/f6fdeb31-0c08-4c87-82a4-5a51af86aa1f-kube-api-access-759nn\") pod \"nova-scheduler-0\" (UID: \"f6fdeb31-0c08-4c87-82a4-5a51af86aa1f\") " pod="openstack/nova-scheduler-0" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.638408 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6fdeb31-0c08-4c87-82a4-5a51af86aa1f-config-data\") pod \"nova-scheduler-0\" 
(UID: \"f6fdeb31-0c08-4c87-82a4-5a51af86aa1f\") " pod="openstack/nova-scheduler-0" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.661733 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6fdeb31-0c08-4c87-82a4-5a51af86aa1f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f6fdeb31-0c08-4c87-82a4-5a51af86aa1f\") " pod="openstack/nova-scheduler-0" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.663724 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-759nn\" (UniqueName: \"kubernetes.io/projected/f6fdeb31-0c08-4c87-82a4-5a51af86aa1f-kube-api-access-759nn\") pod \"nova-scheduler-0\" (UID: \"f6fdeb31-0c08-4c87-82a4-5a51af86aa1f\") " pod="openstack/nova-scheduler-0" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.673442 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6fdeb31-0c08-4c87-82a4-5a51af86aa1f-config-data\") pod \"nova-scheduler-0\" (UID: \"f6fdeb31-0c08-4c87-82a4-5a51af86aa1f\") " pod="openstack/nova-scheduler-0" Dec 05 06:15:20 crc kubenswrapper[4865]: I1205 06:15:20.807716 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.033972 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87766c6c-42b5-4851-9729-29f38ed36ae5" path="/var/lib/kubelet/pods/87766c6c-42b5-4851-9729-29f38ed36ae5/volumes" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.171575 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.249586 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-nova-metadata-tls-certs\") pod \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\" (UID: \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.249662 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk6xb\" (UniqueName: \"kubernetes.io/projected/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-kube-api-access-nk6xb\") pod \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\" (UID: \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.249838 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-config-data\") pod \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\" (UID: \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.249905 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-combined-ca-bundle\") pod \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\" (UID: \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.250043 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-logs\") pod \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\" (UID: \"5025acca-f209-4f4c-ab4e-b7e386f5c3ab\") " Dec 05 06:15:21 
crc kubenswrapper[4865]: I1205 06:15:21.250952 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-logs" (OuterVolumeSpecName: "logs") pod "5025acca-f209-4f4c-ab4e-b7e386f5c3ab" (UID: "5025acca-f209-4f4c-ab4e-b7e386f5c3ab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.260052 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-kube-api-access-nk6xb" (OuterVolumeSpecName: "kube-api-access-nk6xb") pod "5025acca-f209-4f4c-ab4e-b7e386f5c3ab" (UID: "5025acca-f209-4f4c-ab4e-b7e386f5c3ab"). InnerVolumeSpecName "kube-api-access-nk6xb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.320841 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-config-data" (OuterVolumeSpecName: "config-data") pod "5025acca-f209-4f4c-ab4e-b7e386f5c3ab" (UID: "5025acca-f209-4f4c-ab4e-b7e386f5c3ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.323210 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5025acca-f209-4f4c-ab4e-b7e386f5c3ab" (UID: "5025acca-f209-4f4c-ab4e-b7e386f5c3ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.354108 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.354137 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.354149 4865 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-logs\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.354157 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk6xb\" (UniqueName: \"kubernetes.io/projected/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-kube-api-access-nk6xb\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.363585 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5025acca-f209-4f4c-ab4e-b7e386f5c3ab" (UID: "5025acca-f209-4f4c-ab4e-b7e386f5c3ab"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.421121 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5025acca-f209-4f4c-ab4e-b7e386f5c3ab","Type":"ContainerDied","Data":"686b65b7a3958c691b877b885459dc85861353f4c3031eba6e3f3efa6cb38f44"} Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.421187 4865 scope.go:117] "RemoveContainer" containerID="dda00ade2362b19bae318cabdf3adaeb9bbb1de17d6321f885296bf1ceea047c" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.421324 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.464550 4865 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5025acca-f209-4f4c-ab4e-b7e386f5c3ab-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.477207 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.490989 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.494487 4865 scope.go:117] "RemoveContainer" containerID="7f1b9ce1734b8f703e12f26e1ba689eb26a81ec5bdc81bf08cc7045cf96572be" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.514938 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.542450 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:15:21 crc kubenswrapper[4865]: E1205 06:15:21.543445 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5025acca-f209-4f4c-ab4e-b7e386f5c3ab" containerName="nova-metadata-log" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.543458 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="5025acca-f209-4f4c-ab4e-b7e386f5c3ab" containerName="nova-metadata-log" Dec 05 06:15:21 crc kubenswrapper[4865]: E1205 06:15:21.543472 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5025acca-f209-4f4c-ab4e-b7e386f5c3ab" containerName="nova-metadata-metadata" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.543478 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="5025acca-f209-4f4c-ab4e-b7e386f5c3ab" containerName="nova-metadata-metadata" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.575892 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="5025acca-f209-4f4c-ab4e-b7e386f5c3ab" containerName="nova-metadata-log" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.575942 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="5025acca-f209-4f4c-ab4e-b7e386f5c3ab" containerName="nova-metadata-metadata" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.581268 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.581593 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.590088 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.590331 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.785692 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/668d173f-5e28-427e-a382-f905813fc91e-config-data\") pod \"nova-metadata-0\" (UID: \"668d173f-5e28-427e-a382-f905813fc91e\") " pod="openstack/nova-metadata-0" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.786388 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/668d173f-5e28-427e-a382-f905813fc91e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"668d173f-5e28-427e-a382-f905813fc91e\") " pod="openstack/nova-metadata-0" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.786446 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/668d173f-5e28-427e-a382-f905813fc91e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"668d173f-5e28-427e-a382-f905813fc91e\") " pod="openstack/nova-metadata-0" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.786479 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/668d173f-5e28-427e-a382-f905813fc91e-logs\") pod \"nova-metadata-0\" (UID: \"668d173f-5e28-427e-a382-f905813fc91e\") " pod="openstack/nova-metadata-0" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.786673 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqb9t\" (UniqueName: \"kubernetes.io/projected/668d173f-5e28-427e-a382-f905813fc91e-kube-api-access-gqb9t\") pod \"nova-metadata-0\" (UID: \"668d173f-5e28-427e-a382-f905813fc91e\") " pod="openstack/nova-metadata-0" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.888308 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/668d173f-5e28-427e-a382-f905813fc91e-config-data\") pod \"nova-metadata-0\" (UID: \"668d173f-5e28-427e-a382-f905813fc91e\") " pod="openstack/nova-metadata-0" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.888912 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/668d173f-5e28-427e-a382-f905813fc91e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"668d173f-5e28-427e-a382-f905813fc91e\") " pod="openstack/nova-metadata-0" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.888958 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/668d173f-5e28-427e-a382-f905813fc91e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"668d173f-5e28-427e-a382-f905813fc91e\") " pod="openstack/nova-metadata-0" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.888989 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/668d173f-5e28-427e-a382-f905813fc91e-logs\") pod \"nova-metadata-0\" (UID: \"668d173f-5e28-427e-a382-f905813fc91e\") " pod="openstack/nova-metadata-0" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.889042 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqb9t\" (UniqueName: \"kubernetes.io/projected/668d173f-5e28-427e-a382-f905813fc91e-kube-api-access-gqb9t\") pod \"nova-metadata-0\" (UID: \"668d173f-5e28-427e-a382-f905813fc91e\") " pod="openstack/nova-metadata-0" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.889577 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/668d173f-5e28-427e-a382-f905813fc91e-logs\") pod \"nova-metadata-0\" (UID: \"668d173f-5e28-427e-a382-f905813fc91e\") " pod="openstack/nova-metadata-0" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.892492 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/668d173f-5e28-427e-a382-f905813fc91e-config-data\") pod \"nova-metadata-0\" (UID: \"668d173f-5e28-427e-a382-f905813fc91e\") " pod="openstack/nova-metadata-0" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.892977 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/668d173f-5e28-427e-a382-f905813fc91e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"668d173f-5e28-427e-a382-f905813fc91e\") " pod="openstack/nova-metadata-0" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.897791 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/668d173f-5e28-427e-a382-f905813fc91e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"668d173f-5e28-427e-a382-f905813fc91e\") " pod="openstack/nova-metadata-0" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.903437 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqb9t\" (UniqueName: \"kubernetes.io/projected/668d173f-5e28-427e-a382-f905813fc91e-kube-api-access-gqb9t\") pod \"nova-metadata-0\" (UID: \"668d173f-5e28-427e-a382-f905813fc91e\") " pod="openstack/nova-metadata-0" Dec 05 06:15:21 crc kubenswrapper[4865]: I1205 06:15:21.983727 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 06:15:22 crc kubenswrapper[4865]: I1205 06:15:22.444542 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f6fdeb31-0c08-4c87-82a4-5a51af86aa1f","Type":"ContainerStarted","Data":"84b87efa44290cd283779619ed9a277b818dc1181ee36a6ffe61a5f3b215a1bf"} Dec 05 06:15:22 crc kubenswrapper[4865]: I1205 06:15:22.444894 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f6fdeb31-0c08-4c87-82a4-5a51af86aa1f","Type":"ContainerStarted","Data":"a725d82b5e8588b2485a3bf6ce945517cc77659b2863c11f8442c3b874589a10"} Dec 05 06:15:22 crc kubenswrapper[4865]: I1205 06:15:22.449097 4865 generic.go:334] "Generic (PLEG): container finished" podID="8907607c-d42a-49ac-9df0-9da2ceb015eb" containerID="e537772632cb4790421fca55c1d4c51d9a4a6a36fc9af49da4f0e3ed4d7ed409" exitCode=0 Dec 05 06:15:22 crc kubenswrapper[4865]: I1205 06:15:22.449159 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8907607c-d42a-49ac-9df0-9da2ceb015eb","Type":"ContainerDied","Data":"e537772632cb4790421fca55c1d4c51d9a4a6a36fc9af49da4f0e3ed4d7ed409"} Dec 05 06:15:22 crc kubenswrapper[4865]: I1205 06:15:22.470393 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.47037197 podStartE2EDuration="2.47037197s" podCreationTimestamp="2025-12-05 06:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:15:22.465016713 +0000 UTC m=+1341.745027935" watchObservedRunningTime="2025-12-05 06:15:22.47037197 +0000 UTC m=+1341.750383192" Dec 05 06:15:22 crc kubenswrapper[4865]: I1205 06:15:22.504485 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 06:15:22 crc kubenswrapper[4865]: I1205 06:15:22.853297 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.009978 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-internal-tls-certs\") pod \"8907607c-d42a-49ac-9df0-9da2ceb015eb\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.010076 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj9sm\" (UniqueName: \"kubernetes.io/projected/8907607c-d42a-49ac-9df0-9da2ceb015eb-kube-api-access-jj9sm\") pod \"8907607c-d42a-49ac-9df0-9da2ceb015eb\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.010165 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-combined-ca-bundle\") pod \"8907607c-d42a-49ac-9df0-9da2ceb015eb\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.010211 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-public-tls-certs\") pod \"8907607c-d42a-49ac-9df0-9da2ceb015eb\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.010286 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8907607c-d42a-49ac-9df0-9da2ceb015eb-logs\") pod \"8907607c-d42a-49ac-9df0-9da2ceb015eb\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.010343 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-config-data\") pod \"8907607c-d42a-49ac-9df0-9da2ceb015eb\" (UID: \"8907607c-d42a-49ac-9df0-9da2ceb015eb\") " Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.012608 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8907607c-d42a-49ac-9df0-9da2ceb015eb-logs" (OuterVolumeSpecName: "logs") pod "8907607c-d42a-49ac-9df0-9da2ceb015eb" (UID: "8907607c-d42a-49ac-9df0-9da2ceb015eb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.015152 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8907607c-d42a-49ac-9df0-9da2ceb015eb-kube-api-access-jj9sm" (OuterVolumeSpecName: "kube-api-access-jj9sm") pod "8907607c-d42a-49ac-9df0-9da2ceb015eb" (UID: "8907607c-d42a-49ac-9df0-9da2ceb015eb"). InnerVolumeSpecName "kube-api-access-jj9sm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.023220 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5025acca-f209-4f4c-ab4e-b7e386f5c3ab" path="/var/lib/kubelet/pods/5025acca-f209-4f4c-ab4e-b7e386f5c3ab/volumes" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.042062 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-config-data" (OuterVolumeSpecName: "config-data") pod "8907607c-d42a-49ac-9df0-9da2ceb015eb" (UID: "8907607c-d42a-49ac-9df0-9da2ceb015eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.056964 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8907607c-d42a-49ac-9df0-9da2ceb015eb" (UID: "8907607c-d42a-49ac-9df0-9da2ceb015eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.069246 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8907607c-d42a-49ac-9df0-9da2ceb015eb" (UID: "8907607c-d42a-49ac-9df0-9da2ceb015eb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.092031 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8907607c-d42a-49ac-9df0-9da2ceb015eb" (UID: "8907607c-d42a-49ac-9df0-9da2ceb015eb"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.112731 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.112772 4865 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.112784 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj9sm\" (UniqueName: \"kubernetes.io/projected/8907607c-d42a-49ac-9df0-9da2ceb015eb-kube-api-access-jj9sm\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.112794 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.112805 4865 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8907607c-d42a-49ac-9df0-9da2ceb015eb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.112814 4865 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8907607c-d42a-49ac-9df0-9da2ceb015eb-logs\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.292535 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fg748" podUID="b2bc4e34-9781-48f0-84ac-3a7d3e583311" containerName="registry-server" probeResult="failure" output=< Dec 05 06:15:23 crc kubenswrapper[4865]: timeout: failed to connect service ":50051" within 1s Dec 05 06:15:23 crc kubenswrapper[4865]: > Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.464936 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.464928 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8907607c-d42a-49ac-9df0-9da2ceb015eb","Type":"ContainerDied","Data":"d2011d0cfe17a420d521709597ba7658d09d96736e06e9a230e5b6c0f00f1398"} Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.465165 4865 scope.go:117] "RemoveContainer" containerID="e537772632cb4790421fca55c1d4c51d9a4a6a36fc9af49da4f0e3ed4d7ed409" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.469163 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"668d173f-5e28-427e-a382-f905813fc91e","Type":"ContainerStarted","Data":"61924d2433e1e18ff3d751cd46d419b501daba9168299406d392309b9e1619ac"} Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.469214 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"668d173f-5e28-427e-a382-f905813fc91e","Type":"ContainerStarted","Data":"bd2a00c2299198fc36eb0c2c4a5c50e19ad945ac85fb37b6a5582ed039eb408a"} Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.469225 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"668d173f-5e28-427e-a382-f905813fc91e","Type":"ContainerStarted","Data":"104420065e546cd44a27e27067e3ba0cd1e85ceb84123049a5e988a0f5a13946"} Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.494729 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.494695822 podStartE2EDuration="2.494695822s" podCreationTimestamp="2025-12-05 06:15:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:15:23.488321737 +0000 UTC m=+1342.768332959" watchObservedRunningTime="2025-12-05 06:15:23.494695822 +0000 UTC m=+1342.774707044" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.509064 4865 scope.go:117] "RemoveContainer" containerID="64aa5cc6ad8fefad00a3910e973949f379f54bb823ca86c86598cf86018c52dd" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.519625 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.528483 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.547557 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 06:15:23 crc kubenswrapper[4865]: E1205 06:15:23.548002 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8907607c-d42a-49ac-9df0-9da2ceb015eb" containerName="nova-api-log" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.548021 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="8907607c-d42a-49ac-9df0-9da2ceb015eb" containerName="nova-api-log" Dec 05 06:15:23 crc kubenswrapper[4865]: E1205 06:15:23.548042 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8907607c-d42a-49ac-9df0-9da2ceb015eb" containerName="nova-api-api" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.548051 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="8907607c-d42a-49ac-9df0-9da2ceb015eb" containerName="nova-api-api" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.548238 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="8907607c-d42a-49ac-9df0-9da2ceb015eb" 
containerName="nova-api-log" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.548265 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="8907607c-d42a-49ac-9df0-9da2ceb015eb" containerName="nova-api-api" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.549230 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.551329 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.555165 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.555396 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.562144 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.725263 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95b08d9c-9466-4aef-b330-160d014e1e9d-logs\") pod \"nova-api-0\" (UID: \"95b08d9c-9466-4aef-b330-160d014e1e9d\") " pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.725586 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95b08d9c-9466-4aef-b330-160d014e1e9d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"95b08d9c-9466-4aef-b330-160d014e1e9d\") " pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.725617 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj8tp\" (UniqueName: \"kubernetes.io/projected/95b08d9c-9466-4aef-b330-160d014e1e9d-kube-api-access-xj8tp\") pod \"nova-api-0\" (UID: \"95b08d9c-9466-4aef-b330-160d014e1e9d\") " pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.725738 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95b08d9c-9466-4aef-b330-160d014e1e9d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"95b08d9c-9466-4aef-b330-160d014e1e9d\") " pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.725761 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95b08d9c-9466-4aef-b330-160d014e1e9d-config-data\") pod \"nova-api-0\" (UID: \"95b08d9c-9466-4aef-b330-160d014e1e9d\") " pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.725954 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95b08d9c-9466-4aef-b330-160d014e1e9d-public-tls-certs\") pod \"nova-api-0\" (UID: \"95b08d9c-9466-4aef-b330-160d014e1e9d\") " pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.827708 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95b08d9c-9466-4aef-b330-160d014e1e9d-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"95b08d9c-9466-4aef-b330-160d014e1e9d\") " pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.828092 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95b08d9c-9466-4aef-b330-160d014e1e9d-logs\") pod \"nova-api-0\" (UID: \"95b08d9c-9466-4aef-b330-160d014e1e9d\") " pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.828331 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95b08d9c-9466-4aef-b330-160d014e1e9d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"95b08d9c-9466-4aef-b330-160d014e1e9d\") " pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.828459 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj8tp\" (UniqueName: \"kubernetes.io/projected/95b08d9c-9466-4aef-b330-160d014e1e9d-kube-api-access-xj8tp\") pod \"nova-api-0\" (UID: \"95b08d9c-9466-4aef-b330-160d014e1e9d\") " pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.828574 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95b08d9c-9466-4aef-b330-160d014e1e9d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"95b08d9c-9466-4aef-b330-160d014e1e9d\") " pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.828670 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95b08d9c-9466-4aef-b330-160d014e1e9d-config-data\") pod \"nova-api-0\" (UID: \"95b08d9c-9466-4aef-b330-160d014e1e9d\") " pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.828846 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95b08d9c-9466-4aef-b330-160d014e1e9d-logs\") pod \"nova-api-0\" (UID: \"95b08d9c-9466-4aef-b330-160d014e1e9d\") " pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.834112 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95b08d9c-9466-4aef-b330-160d014e1e9d-config-data\") pod \"nova-api-0\" (UID: \"95b08d9c-9466-4aef-b330-160d014e1e9d\") " pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.836158 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95b08d9c-9466-4aef-b330-160d014e1e9d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"95b08d9c-9466-4aef-b330-160d014e1e9d\") " pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.837996 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95b08d9c-9466-4aef-b330-160d014e1e9d-public-tls-certs\") pod \"nova-api-0\" (UID: \"95b08d9c-9466-4aef-b330-160d014e1e9d\") " pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.843015 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95b08d9c-9466-4aef-b330-160d014e1e9d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"95b08d9c-9466-4aef-b330-160d014e1e9d\") " pod="openstack/nova-api-0" Dec 05 06:15:23 
crc kubenswrapper[4865]: I1205 06:15:23.853390 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj8tp\" (UniqueName: \"kubernetes.io/projected/95b08d9c-9466-4aef-b330-160d014e1e9d-kube-api-access-xj8tp\") pod \"nova-api-0\" (UID: \"95b08d9c-9466-4aef-b330-160d014e1e9d\") " pod="openstack/nova-api-0" Dec 05 06:15:23 crc kubenswrapper[4865]: I1205 06:15:23.899423 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 06:15:24 crc kubenswrapper[4865]: I1205 06:15:24.398600 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 06:15:24 crc kubenswrapper[4865]: I1205 06:15:24.485595 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95b08d9c-9466-4aef-b330-160d014e1e9d","Type":"ContainerStarted","Data":"bc4cbc92c83c2bf6dc37a5c4e9dd1cdcf102d16ae705f764e33257b6439632a7"} Dec 05 06:15:25 crc kubenswrapper[4865]: I1205 06:15:25.018164 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8907607c-d42a-49ac-9df0-9da2ceb015eb" path="/var/lib/kubelet/pods/8907607c-d42a-49ac-9df0-9da2ceb015eb/volumes" Dec 05 06:15:25 crc kubenswrapper[4865]: I1205 06:15:25.500380 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95b08d9c-9466-4aef-b330-160d014e1e9d","Type":"ContainerStarted","Data":"61cbab195bf4ce207f3febf2f17105019c6f8d4643930db1edac07342dfbdd8e"} Dec 05 06:15:25 crc kubenswrapper[4865]: I1205 06:15:25.500436 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95b08d9c-9466-4aef-b330-160d014e1e9d","Type":"ContainerStarted","Data":"c85aa94a79a72735df761fbf21e5a3aa8283fe25617197f9039e0ab23f7c212c"} Dec 05 06:15:25 crc kubenswrapper[4865]: I1205 06:15:25.527216 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.527194093 podStartE2EDuration="2.527194093s" podCreationTimestamp="2025-12-05 06:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:15:25.520171989 +0000 UTC m=+1344.800183431" watchObservedRunningTime="2025-12-05 06:15:25.527194093 +0000 UTC m=+1344.807205305" Dec 05 06:15:25 crc kubenswrapper[4865]: I1205 06:15:25.808064 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 06:15:26 crc kubenswrapper[4865]: I1205 06:15:26.983956 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 06:15:26 crc kubenswrapper[4865]: I1205 06:15:26.984247 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 06:15:30 crc kubenswrapper[4865]: I1205 06:15:30.808540 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 06:15:30 crc kubenswrapper[4865]: I1205 06:15:30.845249 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 06:15:31 crc kubenswrapper[4865]: I1205 06:15:31.611274 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 06:15:31 crc kubenswrapper[4865]: I1205 06:15:31.986563 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 06:15:31 crc 
kubenswrapper[4865]: I1205 06:15:31.986959 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 06:15:32 crc kubenswrapper[4865]: I1205 06:15:32.294988 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fg748" Dec 05 06:15:32 crc kubenswrapper[4865]: I1205 06:15:32.354845 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fg748" Dec 05 06:15:33 crc kubenswrapper[4865]: I1205 06:15:33.000910 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="668d173f-5e28-427e-a382-f905813fc91e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 06:15:33 crc kubenswrapper[4865]: I1205 06:15:33.000868 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="668d173f-5e28-427e-a382-f905813fc91e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 06:15:33 crc kubenswrapper[4865]: I1205 06:15:33.080371 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fg748"] Dec 05 06:15:33 crc kubenswrapper[4865]: I1205 06:15:33.591740 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fg748" podUID="b2bc4e34-9781-48f0-84ac-3a7d3e583311" containerName="registry-server" containerID="cri-o://8dae8cda329534a9e83b2b8284879e5c08ffce22890f9016b26706788000949c" gracePeriod=2 Dec 05 06:15:33 crc kubenswrapper[4865]: I1205 06:15:33.901134 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 06:15:33 crc kubenswrapper[4865]: I1205 06:15:33.901566 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 06:15:34 crc kubenswrapper[4865]: I1205 06:15:34.608363 4865 generic.go:334] "Generic (PLEG): container finished" podID="b2bc4e34-9781-48f0-84ac-3a7d3e583311" containerID="8dae8cda329534a9e83b2b8284879e5c08ffce22890f9016b26706788000949c" exitCode=0 Dec 05 06:15:34 crc kubenswrapper[4865]: I1205 06:15:34.608427 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg748" event={"ID":"b2bc4e34-9781-48f0-84ac-3a7d3e583311","Type":"ContainerDied","Data":"8dae8cda329534a9e83b2b8284879e5c08ffce22890f9016b26706788000949c"} Dec 05 06:15:34 crc kubenswrapper[4865]: I1205 06:15:34.764066 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 06:15:34 crc kubenswrapper[4865]: I1205 06:15:34.951056 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="95b08d9c-9466-4aef-b330-160d014e1e9d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 06:15:34 crc kubenswrapper[4865]: I1205 06:15:34.951106 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="95b08d9c-9466-4aef-b330-160d014e1e9d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request 
canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 06:15:35 crc kubenswrapper[4865]: I1205 06:15:35.225287 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c25abf96-27e6-4918-ac97-949d973cc542" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 05 06:15:35 crc kubenswrapper[4865]: I1205 06:15:35.616480 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fg748" Dec 05 06:15:35 crc kubenswrapper[4865]: I1205 06:15:35.638120 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg748" event={"ID":"b2bc4e34-9781-48f0-84ac-3a7d3e583311","Type":"ContainerDied","Data":"1cc7ac23e350206ac090f6c1229214d9ef4c1b0889e82eab9cd9a8b82518d93c"} Dec 05 06:15:35 crc kubenswrapper[4865]: I1205 06:15:35.638493 4865 scope.go:117] "RemoveContainer" containerID="8dae8cda329534a9e83b2b8284879e5c08ffce22890f9016b26706788000949c" Dec 05 06:15:35 crc kubenswrapper[4865]: I1205 06:15:35.638959 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fg748" Dec 05 06:15:35 crc kubenswrapper[4865]: I1205 06:15:35.677482 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2bc4e34-9781-48f0-84ac-3a7d3e583311-utilities\") pod \"b2bc4e34-9781-48f0-84ac-3a7d3e583311\" (UID: \"b2bc4e34-9781-48f0-84ac-3a7d3e583311\") " Dec 05 06:15:35 crc kubenswrapper[4865]: I1205 06:15:35.680275 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2bc4e34-9781-48f0-84ac-3a7d3e583311-catalog-content\") pod \"b2bc4e34-9781-48f0-84ac-3a7d3e583311\" (UID: \"b2bc4e34-9781-48f0-84ac-3a7d3e583311\") " Dec 05 06:15:35 crc kubenswrapper[4865]: I1205 06:15:35.680428 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bckrg\" (UniqueName: \"kubernetes.io/projected/b2bc4e34-9781-48f0-84ac-3a7d3e583311-kube-api-access-bckrg\") pod \"b2bc4e34-9781-48f0-84ac-3a7d3e583311\" (UID: \"b2bc4e34-9781-48f0-84ac-3a7d3e583311\") " Dec 05 06:15:35 crc kubenswrapper[4865]: I1205 06:15:35.679326 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2bc4e34-9781-48f0-84ac-3a7d3e583311-utilities" (OuterVolumeSpecName: "utilities") pod "b2bc4e34-9781-48f0-84ac-3a7d3e583311" (UID: "b2bc4e34-9781-48f0-84ac-3a7d3e583311"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:15:35 crc kubenswrapper[4865]: I1205 06:15:35.683859 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2bc4e34-9781-48f0-84ac-3a7d3e583311-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:35 crc kubenswrapper[4865]: I1205 06:15:35.685314 4865 scope.go:117] "RemoveContainer" containerID="98a77a37d33dc31198597d3dc86dda49255b55628bc835e6f9ff7981b01dc9f3" Dec 05 06:15:35 crc kubenswrapper[4865]: I1205 06:15:35.712088 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2bc4e34-9781-48f0-84ac-3a7d3e583311-kube-api-access-bckrg" (OuterVolumeSpecName: "kube-api-access-bckrg") pod "b2bc4e34-9781-48f0-84ac-3a7d3e583311" (UID: "b2bc4e34-9781-48f0-84ac-3a7d3e583311"). 
InnerVolumeSpecName "kube-api-access-bckrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:15:35 crc kubenswrapper[4865]: I1205 06:15:35.761073 4865 scope.go:117] "RemoveContainer" containerID="9814487d2e8babeb60ecd2e3c6645f84fd96fac32da57300c556b365b8b345f4" Dec 05 06:15:35 crc kubenswrapper[4865]: I1205 06:15:35.785602 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bckrg\" (UniqueName: \"kubernetes.io/projected/b2bc4e34-9781-48f0-84ac-3a7d3e583311-kube-api-access-bckrg\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:35 crc kubenswrapper[4865]: I1205 06:15:35.804919 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2bc4e34-9781-48f0-84ac-3a7d3e583311-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2bc4e34-9781-48f0-84ac-3a7d3e583311" (UID: "b2bc4e34-9781-48f0-84ac-3a7d3e583311"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:15:35 crc kubenswrapper[4865]: I1205 06:15:35.887342 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2bc4e34-9781-48f0-84ac-3a7d3e583311-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:35 crc kubenswrapper[4865]: I1205 06:15:35.986403 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fg748"] Dec 05 06:15:35 crc kubenswrapper[4865]: I1205 06:15:35.995112 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fg748"] Dec 05 06:15:37 crc kubenswrapper[4865]: I1205 06:15:37.018520 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2bc4e34-9781-48f0-84ac-3a7d3e583311" path="/var/lib/kubelet/pods/b2bc4e34-9781-48f0-84ac-3a7d3e583311/volumes" Dec 05 06:15:41 crc kubenswrapper[4865]: I1205 06:15:41.049494 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:15:41 crc kubenswrapper[4865]: I1205 06:15:41.050447 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:15:41 crc kubenswrapper[4865]: I1205 06:15:41.050521 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 06:15:41 crc kubenswrapper[4865]: I1205 06:15:41.051669 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39937e10b37729de9655b631fb05427006e716f9ab3edcd0d9c7edbbc9b5832a"} pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 06:15:41 crc kubenswrapper[4865]: I1205 06:15:41.051750 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" 
containerID="cri-o://39937e10b37729de9655b631fb05427006e716f9ab3edcd0d9c7edbbc9b5832a" gracePeriod=600 Dec 05 06:15:41 crc kubenswrapper[4865]: I1205 06:15:41.709311 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="39937e10b37729de9655b631fb05427006e716f9ab3edcd0d9c7edbbc9b5832a" exitCode=0 Dec 05 06:15:41 crc kubenswrapper[4865]: I1205 06:15:41.709413 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"39937e10b37729de9655b631fb05427006e716f9ab3edcd0d9c7edbbc9b5832a"} Dec 05 06:15:41 crc kubenswrapper[4865]: I1205 06:15:41.709640 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e"} Dec 05 06:15:41 crc kubenswrapper[4865]: I1205 06:15:41.709666 4865 scope.go:117] "RemoveContainer" containerID="20536395e22903e5ca8dee5d63c34f131b2da1d0f9f86ec93b930a0c9e072342" Dec 05 06:15:41 crc kubenswrapper[4865]: I1205 06:15:41.993926 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 06:15:41 crc kubenswrapper[4865]: I1205 06:15:41.995190 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.007461 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.730087 4865 generic.go:334] "Generic (PLEG): container finished" podID="c25abf96-27e6-4918-ac97-949d973cc542" containerID="89fca49ca21b8feac967a5eeffa7bb4c719b9c5501b5233893286f0ae5f9189f" exitCode=137 Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.730157 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c25abf96-27e6-4918-ac97-949d973cc542","Type":"ContainerDied","Data":"89fca49ca21b8feac967a5eeffa7bb4c719b9c5501b5233893286f0ae5f9189f"} Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.731718 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c25abf96-27e6-4918-ac97-949d973cc542","Type":"ContainerDied","Data":"a2474c4315faca12a060f6ab8dac8141db329fd18a9f47d202c784576bfbf578"} Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.731737 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2474c4315faca12a060f6ab8dac8141db329fd18a9f47d202c784576bfbf578" Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.739365 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.742505 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.843665 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-combined-ca-bundle\") pod \"c25abf96-27e6-4918-ac97-949d973cc542\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.845514 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-sg-core-conf-yaml\") pod \"c25abf96-27e6-4918-ac97-949d973cc542\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.845690 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c25abf96-27e6-4918-ac97-949d973cc542-log-httpd\") pod \"c25abf96-27e6-4918-ac97-949d973cc542\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.845883 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wcsp\" (UniqueName: \"kubernetes.io/projected/c25abf96-27e6-4918-ac97-949d973cc542-kube-api-access-8wcsp\") pod \"c25abf96-27e6-4918-ac97-949d973cc542\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.846132 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c25abf96-27e6-4918-ac97-949d973cc542-run-httpd\") pod \"c25abf96-27e6-4918-ac97-949d973cc542\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.846308 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-scripts\") pod \"c25abf96-27e6-4918-ac97-949d973cc542\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.846452 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c25abf96-27e6-4918-ac97-949d973cc542-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c25abf96-27e6-4918-ac97-949d973cc542" (UID: "c25abf96-27e6-4918-ac97-949d973cc542"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.847129 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c25abf96-27e6-4918-ac97-949d973cc542-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c25abf96-27e6-4918-ac97-949d973cc542" (UID: "c25abf96-27e6-4918-ac97-949d973cc542"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.846480 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-config-data\") pod \"c25abf96-27e6-4918-ac97-949d973cc542\" (UID: \"c25abf96-27e6-4918-ac97-949d973cc542\") " Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.853725 4865 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c25abf96-27e6-4918-ac97-949d973cc542-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.853755 4865 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c25abf96-27e6-4918-ac97-949d973cc542-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.860746 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-scripts" (OuterVolumeSpecName: "scripts") pod "c25abf96-27e6-4918-ac97-949d973cc542" (UID: "c25abf96-27e6-4918-ac97-949d973cc542"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.860959 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c25abf96-27e6-4918-ac97-949d973cc542-kube-api-access-8wcsp" (OuterVolumeSpecName: "kube-api-access-8wcsp") pod "c25abf96-27e6-4918-ac97-949d973cc542" (UID: "c25abf96-27e6-4918-ac97-949d973cc542"). InnerVolumeSpecName "kube-api-access-8wcsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.904754 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c25abf96-27e6-4918-ac97-949d973cc542" (UID: "c25abf96-27e6-4918-ac97-949d973cc542"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.958460 4865 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.958742 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wcsp\" (UniqueName: \"kubernetes.io/projected/c25abf96-27e6-4918-ac97-949d973cc542-kube-api-access-8wcsp\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.958871 4865 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:42 crc kubenswrapper[4865]: I1205 06:15:42.994606 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c25abf96-27e6-4918-ac97-949d973cc542" (UID: "c25abf96-27e6-4918-ac97-949d973cc542"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.046243 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-config-data" (OuterVolumeSpecName: "config-data") pod "c25abf96-27e6-4918-ac97-949d973cc542" (UID: "c25abf96-27e6-4918-ac97-949d973cc542"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.061206 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.061402 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25abf96-27e6-4918-ac97-949d973cc542-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.742890 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.786159 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.799762 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.820690 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:15:43 crc kubenswrapper[4865]: E1205 06:15:43.821235 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c25abf96-27e6-4918-ac97-949d973cc542" containerName="ceilometer-notification-agent" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.821259 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25abf96-27e6-4918-ac97-949d973cc542" containerName="ceilometer-notification-agent" Dec 05 06:15:43 crc kubenswrapper[4865]: E1205 06:15:43.821298 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2bc4e34-9781-48f0-84ac-3a7d3e583311" containerName="extract-content" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.821309 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2bc4e34-9781-48f0-84ac-3a7d3e583311" containerName="extract-content" Dec 05 06:15:43 crc kubenswrapper[4865]: E1205 06:15:43.821347 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c25abf96-27e6-4918-ac97-949d973cc542" containerName="ceilometer-central-agent" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.821355 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25abf96-27e6-4918-ac97-949d973cc542" containerName="ceilometer-central-agent" Dec 05 06:15:43 crc kubenswrapper[4865]: E1205 06:15:43.821376 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2bc4e34-9781-48f0-84ac-3a7d3e583311" containerName="registry-server" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.821387 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2bc4e34-9781-48f0-84ac-3a7d3e583311" containerName="registry-server" Dec 05 06:15:43 crc kubenswrapper[4865]: E1205 06:15:43.821418 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2bc4e34-9781-48f0-84ac-3a7d3e583311" containerName="extract-utilities" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.821430 4865 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b2bc4e34-9781-48f0-84ac-3a7d3e583311" containerName="extract-utilities" Dec 05 06:15:43 crc kubenswrapper[4865]: E1205 06:15:43.821449 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c25abf96-27e6-4918-ac97-949d973cc542" containerName="sg-core" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.821460 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25abf96-27e6-4918-ac97-949d973cc542" containerName="sg-core" Dec 05 06:15:43 crc kubenswrapper[4865]: E1205 06:15:43.821478 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c25abf96-27e6-4918-ac97-949d973cc542" containerName="proxy-httpd" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.821489 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25abf96-27e6-4918-ac97-949d973cc542" containerName="proxy-httpd" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.821735 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="c25abf96-27e6-4918-ac97-949d973cc542" containerName="proxy-httpd" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.821762 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2bc4e34-9781-48f0-84ac-3a7d3e583311" containerName="registry-server" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.821775 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="c25abf96-27e6-4918-ac97-949d973cc542" containerName="ceilometer-central-agent" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.821786 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="c25abf96-27e6-4918-ac97-949d973cc542" containerName="ceilometer-notification-agent" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.821811 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="c25abf96-27e6-4918-ac97-949d973cc542" containerName="sg-core" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.824277 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.834950 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.838377 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.838711 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.853297 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.876497 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/335ba680-a368-498b-8356-ef03d2c5cfb1-run-httpd\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.876751 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/335ba680-a368-498b-8356-ef03d2c5cfb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.876854 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/335ba680-a368-498b-8356-ef03d2c5cfb1-log-httpd\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.876946 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/335ba680-a368-498b-8356-ef03d2c5cfb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.877020 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/335ba680-a368-498b-8356-ef03d2c5cfb1-scripts\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.877089 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krfzb\" (UniqueName: \"kubernetes.io/projected/335ba680-a368-498b-8356-ef03d2c5cfb1-kube-api-access-krfzb\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.877236 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/335ba680-a368-498b-8356-ef03d2c5cfb1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.877326 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/335ba680-a368-498b-8356-ef03d2c5cfb1-config-data\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.909418 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.909766 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.910057 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.910562 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.915837 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.916382 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.978712 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/335ba680-a368-498b-8356-ef03d2c5cfb1-log-httpd\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.978789 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/335ba680-a368-498b-8356-ef03d2c5cfb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.978812 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/335ba680-a368-498b-8356-ef03d2c5cfb1-scripts\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.978844 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krfzb\" (UniqueName: \"kubernetes.io/projected/335ba680-a368-498b-8356-ef03d2c5cfb1-kube-api-access-krfzb\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.978926 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/335ba680-a368-498b-8356-ef03d2c5cfb1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.978963 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/335ba680-a368-498b-8356-ef03d2c5cfb1-config-data\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.978994 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/335ba680-a368-498b-8356-ef03d2c5cfb1-run-httpd\") pod \"ceilometer-0\" (UID: 
\"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.979034 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/335ba680-a368-498b-8356-ef03d2c5cfb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.980274 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/335ba680-a368-498b-8356-ef03d2c5cfb1-log-httpd\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.980326 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/335ba680-a368-498b-8356-ef03d2c5cfb1-run-httpd\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.984889 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/335ba680-a368-498b-8356-ef03d2c5cfb1-scripts\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.987489 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/335ba680-a368-498b-8356-ef03d2c5cfb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:43 crc kubenswrapper[4865]: I1205 06:15:43.988709 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/335ba680-a368-498b-8356-ef03d2c5cfb1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:44 crc kubenswrapper[4865]: I1205 06:15:44.000303 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/335ba680-a368-498b-8356-ef03d2c5cfb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:44 crc kubenswrapper[4865]: I1205 06:15:44.001646 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krfzb\" (UniqueName: \"kubernetes.io/projected/335ba680-a368-498b-8356-ef03d2c5cfb1-kube-api-access-krfzb\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:44 crc kubenswrapper[4865]: I1205 06:15:44.004552 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/335ba680-a368-498b-8356-ef03d2c5cfb1-config-data\") pod \"ceilometer-0\" (UID: \"335ba680-a368-498b-8356-ef03d2c5cfb1\") " pod="openstack/ceilometer-0" Dec 05 06:15:44 crc kubenswrapper[4865]: I1205 06:15:44.157588 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 06:15:44 crc kubenswrapper[4865]: I1205 06:15:44.746740 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 06:15:44 crc kubenswrapper[4865]: W1205 06:15:44.756139 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod335ba680_a368_498b_8356_ef03d2c5cfb1.slice/crio-ff675e5d65694a721ca49274e3fc54260b7ea6ea7218cda965af7c39ef5c1fa9 WatchSource:0}: Error finding container ff675e5d65694a721ca49274e3fc54260b7ea6ea7218cda965af7c39ef5c1fa9: Status 404 returned error can't find the container with id ff675e5d65694a721ca49274e3fc54260b7ea6ea7218cda965af7c39ef5c1fa9 Dec 05 06:15:45 crc kubenswrapper[4865]: I1205 06:15:45.018887 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c25abf96-27e6-4918-ac97-949d973cc542" path="/var/lib/kubelet/pods/c25abf96-27e6-4918-ac97-949d973cc542/volumes" Dec 05 06:15:45 crc kubenswrapper[4865]: I1205 06:15:45.764504 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"335ba680-a368-498b-8356-ef03d2c5cfb1","Type":"ContainerStarted","Data":"3f41798e07606905b4227121fa7783d7f6c8039a7fc8066603b1288a43288258"} Dec 05 06:15:45 crc kubenswrapper[4865]: I1205 06:15:45.764807 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"335ba680-a368-498b-8356-ef03d2c5cfb1","Type":"ContainerStarted","Data":"ff675e5d65694a721ca49274e3fc54260b7ea6ea7218cda965af7c39ef5c1fa9"} Dec 05 06:15:46 crc kubenswrapper[4865]: I1205 06:15:46.775054 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"335ba680-a368-498b-8356-ef03d2c5cfb1","Type":"ContainerStarted","Data":"47ce67e2a4e5971f92551158102862afbcb3b4080f376bb5d9d8f61baa6ec2de"} Dec 05 06:15:48 crc kubenswrapper[4865]: I1205 06:15:48.793984 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"335ba680-a368-498b-8356-ef03d2c5cfb1","Type":"ContainerStarted","Data":"e9574b10559773e6a789226cdcf8643a9f9c2f77c82e54e4caebc366a8627d75"} Dec 05 06:15:50 crc kubenswrapper[4865]: I1205 06:15:50.848338 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"335ba680-a368-498b-8356-ef03d2c5cfb1","Type":"ContainerStarted","Data":"57850a4af1fda515eec74531656706d7e193db98d28388097346bdb2528c5c57"} Dec 05 06:15:50 crc kubenswrapper[4865]: I1205 06:15:50.849632 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 06:15:50 crc kubenswrapper[4865]: I1205 06:15:50.891496 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.046858322 podStartE2EDuration="7.891473028s" podCreationTimestamp="2025-12-05 06:15:43 +0000 UTC" firstStartedPulling="2025-12-05 06:15:44.763704103 +0000 UTC m=+1364.043715315" lastFinishedPulling="2025-12-05 06:15:49.608318809 +0000 UTC m=+1368.888330021" observedRunningTime="2025-12-05 06:15:50.885632677 +0000 UTC m=+1370.165643899" watchObservedRunningTime="2025-12-05 06:15:50.891473028 +0000 UTC m=+1370.171484260" Dec 05 06:16:14 crc kubenswrapper[4865]: I1205 06:16:14.168459 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 06:16:23 crc kubenswrapper[4865]: I1205 06:16:23.687919 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/rabbitmq-server-0"] Dec 05 06:16:24 crc kubenswrapper[4865]: I1205 06:16:24.616576 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 06:16:28 crc kubenswrapper[4865]: I1205 06:16:28.607381 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" containerName="rabbitmq" containerID="cri-o://e4dc6db7c977e358c17d555fa585da5768e5c4183459af9eccb20e6a88adfaf9" gracePeriod=604796 Dec 05 06:16:29 crc kubenswrapper[4865]: I1205 06:16:29.196371 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="ff4eacf2-62b6-48a0-9650-77e19a6db904" containerName="rabbitmq" containerID="cri-o://9ef3a7665eacae1fb28a763e4075c724f9dba94f76e6e8b58d9e619e65ee57b5" gracePeriod=604796 Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.074503 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-g65l8"] Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.084762 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.089132 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.094425 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-g65l8"] Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.155274 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.155337 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7j2q\" (UniqueName: \"kubernetes.io/projected/5b86488f-d09a-42b0-90f4-c9a188eab8bd-kube-api-access-g7j2q\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.155417 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.155475 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.155528 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-dns-svc\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: 
\"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.155557 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-config\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.155582 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.257560 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-dns-svc\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.257609 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-config\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.257637 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.257716 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.257746 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7j2q\" (UniqueName: \"kubernetes.io/projected/5b86488f-d09a-42b0-90f4-c9a188eab8bd-kube-api-access-g7j2q\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.257822 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.257899 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: 
\"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.259379 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-config\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.259401 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.259469 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.259476 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.259603 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.259789 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-dns-svc\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.283777 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7j2q\" (UniqueName: \"kubernetes.io/projected/5b86488f-d09a-42b0-90f4-c9a188eab8bd-kube-api-access-g7j2q\") pod \"dnsmasq-dns-5576978c7c-g65l8\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.348335 4865 generic.go:334] "Generic (PLEG): container finished" podID="b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" containerID="e4dc6db7c977e358c17d555fa585da5768e5c4183459af9eccb20e6a88adfaf9" exitCode=0 Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.348385 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3","Type":"ContainerDied","Data":"e4dc6db7c977e358c17d555fa585da5768e5c4183459af9eccb20e6a88adfaf9"} Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.348422 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3","Type":"ContainerDied","Data":"6b031c927d1ba7e31c906af03a3e3c5733431cd688c4399aad5b24d5a5cc6be2"} Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.348437 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b031c927d1ba7e31c906af03a3e3c5733431cd688c4399aad5b24d5a5cc6be2" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.428136 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.481157 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.574525 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.574635 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-erlang-cookie-secret\") pod \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.574723 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-plugins-conf\") pod \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.574855 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvw2t\" (UniqueName: \"kubernetes.io/projected/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-kube-api-access-jvw2t\") pod \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.574885 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-erlang-cookie\") pod \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.574938 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-config-data\") pod \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.574964 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-tls\") pod \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.575000 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-server-conf\") pod \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " Dec 
05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.575029 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-confd\") pod \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.575066 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-plugins\") pod \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.575085 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-pod-info\") pod \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\" (UID: \"b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.578950 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" (UID: "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.579197 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" (UID: "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.579759 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" (UID: "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.586712 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" (UID: "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.594642 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" (UID: "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.594684 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-kube-api-access-jvw2t" (OuterVolumeSpecName: "kube-api-access-jvw2t") pod "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" (UID: "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3"). InnerVolumeSpecName "kube-api-access-jvw2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.601140 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-pod-info" (OuterVolumeSpecName: "pod-info") pod "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" (UID: "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.627400 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" (UID: "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.680584 4865 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.680628 4865 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.680646 4865 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.680683 4865 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.680699 4865 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.680712 4865 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.680728 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvw2t\" (UniqueName: \"kubernetes.io/projected/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-kube-api-access-jvw2t\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.680743 4865 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:35 crc 
kubenswrapper[4865]: I1205 06:16:35.710455 4865 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.717743 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-config-data" (OuterVolumeSpecName: "config-data") pod "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" (UID: "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.780909 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-server-conf" (OuterVolumeSpecName: "server-conf") pod "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" (UID: "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.782430 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.782462 4865 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.782473 4865 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.798517 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.812035 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" (UID: "b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.885462 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff4eacf2-62b6-48a0-9650-77e19a6db904-config-data\") pod \"ff4eacf2-62b6-48a0-9650-77e19a6db904\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.885966 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d99nb\" (UniqueName: \"kubernetes.io/projected/ff4eacf2-62b6-48a0-9650-77e19a6db904-kube-api-access-d99nb\") pod \"ff4eacf2-62b6-48a0-9650-77e19a6db904\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.885992 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-erlang-cookie\") pod \"ff4eacf2-62b6-48a0-9650-77e19a6db904\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.886009 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-tls\") pod \"ff4eacf2-62b6-48a0-9650-77e19a6db904\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.886116 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ff4eacf2-62b6-48a0-9650-77e19a6db904-erlang-cookie-secret\") pod \"ff4eacf2-62b6-48a0-9650-77e19a6db904\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.886149 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ff4eacf2-62b6-48a0-9650-77e19a6db904-server-conf\") pod \"ff4eacf2-62b6-48a0-9650-77e19a6db904\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.886185 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ff4eacf2-62b6-48a0-9650-77e19a6db904\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.886216 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ff4eacf2-62b6-48a0-9650-77e19a6db904-pod-info\") pod \"ff4eacf2-62b6-48a0-9650-77e19a6db904\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.886233 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-confd\") pod \"ff4eacf2-62b6-48a0-9650-77e19a6db904\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.887904 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-plugins\") pod \"ff4eacf2-62b6-48a0-9650-77e19a6db904\" (UID: 
\"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.887957 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ff4eacf2-62b6-48a0-9650-77e19a6db904-plugins-conf\") pod \"ff4eacf2-62b6-48a0-9650-77e19a6db904\" (UID: \"ff4eacf2-62b6-48a0-9650-77e19a6db904\") " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.892401 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ff4eacf2-62b6-48a0-9650-77e19a6db904" (UID: "ff4eacf2-62b6-48a0-9650-77e19a6db904"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.894280 4865 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.894486 4865 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.895231 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ff4eacf2-62b6-48a0-9650-77e19a6db904" (UID: "ff4eacf2-62b6-48a0-9650-77e19a6db904"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.896284 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff4eacf2-62b6-48a0-9650-77e19a6db904-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ff4eacf2-62b6-48a0-9650-77e19a6db904" (UID: "ff4eacf2-62b6-48a0-9650-77e19a6db904"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.903169 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ff4eacf2-62b6-48a0-9650-77e19a6db904-pod-info" (OuterVolumeSpecName: "pod-info") pod "ff4eacf2-62b6-48a0-9650-77e19a6db904" (UID: "ff4eacf2-62b6-48a0-9650-77e19a6db904"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.916497 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ff4eacf2-62b6-48a0-9650-77e19a6db904" (UID: "ff4eacf2-62b6-48a0-9650-77e19a6db904"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.921114 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff4eacf2-62b6-48a0-9650-77e19a6db904-kube-api-access-d99nb" (OuterVolumeSpecName: "kube-api-access-d99nb") pod "ff4eacf2-62b6-48a0-9650-77e19a6db904" (UID: "ff4eacf2-62b6-48a0-9650-77e19a6db904"). 
InnerVolumeSpecName "kube-api-access-d99nb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.921556 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4eacf2-62b6-48a0-9650-77e19a6db904-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ff4eacf2-62b6-48a0-9650-77e19a6db904" (UID: "ff4eacf2-62b6-48a0-9650-77e19a6db904"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.928511 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "ff4eacf2-62b6-48a0-9650-77e19a6db904" (UID: "ff4eacf2-62b6-48a0-9650-77e19a6db904"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.965096 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff4eacf2-62b6-48a0-9650-77e19a6db904-config-data" (OuterVolumeSpecName: "config-data") pod "ff4eacf2-62b6-48a0-9650-77e19a6db904" (UID: "ff4eacf2-62b6-48a0-9650-77e19a6db904"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.998992 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff4eacf2-62b6-48a0-9650-77e19a6db904-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.999026 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d99nb\" (UniqueName: \"kubernetes.io/projected/ff4eacf2-62b6-48a0-9650-77e19a6db904-kube-api-access-d99nb\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.999039 4865 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.999049 4865 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ff4eacf2-62b6-48a0-9650-77e19a6db904-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.999080 4865 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.999094 4865 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ff4eacf2-62b6-48a0-9650-77e19a6db904-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.999106 4865 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:35 crc kubenswrapper[4865]: I1205 06:16:35.999117 4865 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ff4eacf2-62b6-48a0-9650-77e19a6db904-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:36 
crc kubenswrapper[4865]: I1205 06:16:36.044625 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff4eacf2-62b6-48a0-9650-77e19a6db904-server-conf" (OuterVolumeSpecName: "server-conf") pod "ff4eacf2-62b6-48a0-9650-77e19a6db904" (UID: "ff4eacf2-62b6-48a0-9650-77e19a6db904"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.082667 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-g65l8"] Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.096380 4865 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.101143 4865 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ff4eacf2-62b6-48a0-9650-77e19a6db904-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.101218 4865 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.165083 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ff4eacf2-62b6-48a0-9650-77e19a6db904" (UID: "ff4eacf2-62b6-48a0-9650-77e19a6db904"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.203339 4865 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ff4eacf2-62b6-48a0-9650-77e19a6db904-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.363010 4865 generic.go:334] "Generic (PLEG): container finished" podID="ff4eacf2-62b6-48a0-9650-77e19a6db904" containerID="9ef3a7665eacae1fb28a763e4075c724f9dba94f76e6e8b58d9e619e65ee57b5" exitCode=0 Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.363093 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ff4eacf2-62b6-48a0-9650-77e19a6db904","Type":"ContainerDied","Data":"9ef3a7665eacae1fb28a763e4075c724f9dba94f76e6e8b58d9e619e65ee57b5"} Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.363135 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.363400 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ff4eacf2-62b6-48a0-9650-77e19a6db904","Type":"ContainerDied","Data":"b95c6f356ce9612272cd648b356e6894566ef6dd498e1dfff75931f4fcf8fc82"} Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.363433 4865 scope.go:117] "RemoveContainer" containerID="9ef3a7665eacae1fb28a763e4075c724f9dba94f76e6e8b58d9e619e65ee57b5" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.365960 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.368588 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-g65l8" event={"ID":"5b86488f-d09a-42b0-90f4-c9a188eab8bd","Type":"ContainerStarted","Data":"def1443869b7fddd6b37a7b9a6f084fd439131aed86e2d610e200e88396d1159"} Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.368672 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-g65l8" event={"ID":"5b86488f-d09a-42b0-90f4-c9a188eab8bd","Type":"ContainerStarted","Data":"4cfbb5852d828a894fd0f487cf245fb46f4e1da289c6dc1d5e7b5f23e57087e3"} Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.391766 4865 scope.go:117] "RemoveContainer" containerID="c7f574b73f3827d92d357368f5ca3d0ed0908143aa99fad5d4fc36c55180e6ef" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.433791 4865 scope.go:117] "RemoveContainer" containerID="9ef3a7665eacae1fb28a763e4075c724f9dba94f76e6e8b58d9e619e65ee57b5" Dec 05 06:16:36 crc kubenswrapper[4865]: E1205 06:16:36.434462 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ef3a7665eacae1fb28a763e4075c724f9dba94f76e6e8b58d9e619e65ee57b5\": container with ID starting with 9ef3a7665eacae1fb28a763e4075c724f9dba94f76e6e8b58d9e619e65ee57b5 not found: ID does not exist" containerID="9ef3a7665eacae1fb28a763e4075c724f9dba94f76e6e8b58d9e619e65ee57b5" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.434510 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ef3a7665eacae1fb28a763e4075c724f9dba94f76e6e8b58d9e619e65ee57b5"} err="failed to get container status \"9ef3a7665eacae1fb28a763e4075c724f9dba94f76e6e8b58d9e619e65ee57b5\": rpc error: code = NotFound desc = could not find container \"9ef3a7665eacae1fb28a763e4075c724f9dba94f76e6e8b58d9e619e65ee57b5\": container with ID starting with 9ef3a7665eacae1fb28a763e4075c724f9dba94f76e6e8b58d9e619e65ee57b5 not found: ID does not exist" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.434544 4865 scope.go:117] "RemoveContainer" containerID="c7f574b73f3827d92d357368f5ca3d0ed0908143aa99fad5d4fc36c55180e6ef" Dec 05 06:16:36 crc kubenswrapper[4865]: E1205 06:16:36.436379 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7f574b73f3827d92d357368f5ca3d0ed0908143aa99fad5d4fc36c55180e6ef\": container with ID starting with c7f574b73f3827d92d357368f5ca3d0ed0908143aa99fad5d4fc36c55180e6ef not found: ID does not exist" containerID="c7f574b73f3827d92d357368f5ca3d0ed0908143aa99fad5d4fc36c55180e6ef" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.436419 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7f574b73f3827d92d357368f5ca3d0ed0908143aa99fad5d4fc36c55180e6ef"} err="failed to get container status \"c7f574b73f3827d92d357368f5ca3d0ed0908143aa99fad5d4fc36c55180e6ef\": rpc error: code = NotFound desc = could not find container \"c7f574b73f3827d92d357368f5ca3d0ed0908143aa99fad5d4fc36c55180e6ef\": container with ID starting with c7f574b73f3827d92d357368f5ca3d0ed0908143aa99fad5d4fc36c55180e6ef not found: ID does not exist" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.456094 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 
06:16:36.468962 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.482814 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.492524 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.503928 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 06:16:36 crc kubenswrapper[4865]: E1205 06:16:36.504507 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff4eacf2-62b6-48a0-9650-77e19a6db904" containerName="rabbitmq" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.504530 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff4eacf2-62b6-48a0-9650-77e19a6db904" containerName="rabbitmq" Dec 05 06:16:36 crc kubenswrapper[4865]: E1205 06:16:36.504546 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" containerName="setup-container" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.504554 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" containerName="setup-container" Dec 05 06:16:36 crc kubenswrapper[4865]: E1205 06:16:36.504570 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" containerName="rabbitmq" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.504578 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" containerName="rabbitmq" Dec 05 06:16:36 crc kubenswrapper[4865]: E1205 06:16:36.504594 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff4eacf2-62b6-48a0-9650-77e19a6db904" containerName="setup-container" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.504601 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff4eacf2-62b6-48a0-9650-77e19a6db904" containerName="setup-container" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.504803 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" containerName="rabbitmq" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.504840 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff4eacf2-62b6-48a0-9650-77e19a6db904" containerName="rabbitmq" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.505892 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.513732 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.514744 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.514898 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.515446 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.516513 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.517567 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-b2hp7" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.518090 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.567654 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.609802 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.627245 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.632367 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.632601 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.632880 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-ds6zh" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.633199 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.633339 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.633475 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.633709 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.658346 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.714921 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d853a9c1-f9c9-412e-91bb-9f87123db63d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.715007 4865 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d853a9c1-f9c9-412e-91bb-9f87123db63d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.715525 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d853a9c1-f9c9-412e-91bb-9f87123db63d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.715770 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d853a9c1-f9c9-412e-91bb-9f87123db63d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.715807 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.715943 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d853a9c1-f9c9-412e-91bb-9f87123db63d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.715989 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d853a9c1-f9c9-412e-91bb-9f87123db63d-config-data\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.716076 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvdl8\" (UniqueName: \"kubernetes.io/projected/d853a9c1-f9c9-412e-91bb-9f87123db63d-kube-api-access-pvdl8\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.716108 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d853a9c1-f9c9-412e-91bb-9f87123db63d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.716131 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d853a9c1-f9c9-412e-91bb-9f87123db63d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.716220 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d853a9c1-f9c9-412e-91bb-9f87123db63d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.817965 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d853a9c1-f9c9-412e-91bb-9f87123db63d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.818063 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d853a9c1-f9c9-412e-91bb-9f87123db63d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.818160 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9197b580-1cf6-4939-abfd-8dcac6a5df7e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.818195 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr6v8\" (UniqueName: \"kubernetes.io/projected/9197b580-1cf6-4939-abfd-8dcac6a5df7e-kube-api-access-fr6v8\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.818268 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d853a9c1-f9c9-412e-91bb-9f87123db63d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.818303 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.818358 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9197b580-1cf6-4939-abfd-8dcac6a5df7e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.818404 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9197b580-1cf6-4939-abfd-8dcac6a5df7e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.818440 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/9197b580-1cf6-4939-abfd-8dcac6a5df7e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.818479 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9197b580-1cf6-4939-abfd-8dcac6a5df7e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.818516 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d853a9c1-f9c9-412e-91bb-9f87123db63d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.818559 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d853a9c1-f9c9-412e-91bb-9f87123db63d-config-data\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.818600 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9197b580-1cf6-4939-abfd-8dcac6a5df7e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.818631 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9197b580-1cf6-4939-abfd-8dcac6a5df7e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.818678 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9197b580-1cf6-4939-abfd-8dcac6a5df7e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.818721 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvdl8\" (UniqueName: \"kubernetes.io/projected/d853a9c1-f9c9-412e-91bb-9f87123db63d-kube-api-access-pvdl8\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.818766 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d853a9c1-f9c9-412e-91bb-9f87123db63d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.818881 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d853a9c1-f9c9-412e-91bb-9f87123db63d-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.818957 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.819082 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d853a9c1-f9c9-412e-91bb-9f87123db63d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.819149 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d853a9c1-f9c9-412e-91bb-9f87123db63d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.819173 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9197b580-1cf6-4939-abfd-8dcac6a5df7e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.819761 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d853a9c1-f9c9-412e-91bb-9f87123db63d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.820043 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d853a9c1-f9c9-412e-91bb-9f87123db63d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.820313 4865 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.820990 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d853a9c1-f9c9-412e-91bb-9f87123db63d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.821698 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d853a9c1-f9c9-412e-91bb-9f87123db63d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.823291 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d853a9c1-f9c9-412e-91bb-9f87123db63d-config-data\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.824501 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d853a9c1-f9c9-412e-91bb-9f87123db63d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.827508 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d853a9c1-f9c9-412e-91bb-9f87123db63d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.829317 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d853a9c1-f9c9-412e-91bb-9f87123db63d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.839791 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d853a9c1-f9c9-412e-91bb-9f87123db63d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.848421 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvdl8\" (UniqueName: \"kubernetes.io/projected/d853a9c1-f9c9-412e-91bb-9f87123db63d-kube-api-access-pvdl8\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.878559 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d853a9c1-f9c9-412e-91bb-9f87123db63d\") " pod="openstack/rabbitmq-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.921572 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9197b580-1cf6-4939-abfd-8dcac6a5df7e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.921651 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9197b580-1cf6-4939-abfd-8dcac6a5df7e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.921680 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9197b580-1cf6-4939-abfd-8dcac6a5df7e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.921713 4865 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9197b580-1cf6-4939-abfd-8dcac6a5df7e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.921758 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9197b580-1cf6-4939-abfd-8dcac6a5df7e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.921780 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9197b580-1cf6-4939-abfd-8dcac6a5df7e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.921811 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9197b580-1cf6-4939-abfd-8dcac6a5df7e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.921869 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.921959 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9197b580-1cf6-4939-abfd-8dcac6a5df7e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.922010 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9197b580-1cf6-4939-abfd-8dcac6a5df7e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.922031 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr6v8\" (UniqueName: \"kubernetes.io/projected/9197b580-1cf6-4939-abfd-8dcac6a5df7e-kube-api-access-fr6v8\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.923060 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9197b580-1cf6-4939-abfd-8dcac6a5df7e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.923094 4865 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.928499 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9197b580-1cf6-4939-abfd-8dcac6a5df7e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.928787 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9197b580-1cf6-4939-abfd-8dcac6a5df7e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.930183 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9197b580-1cf6-4939-abfd-8dcac6a5df7e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.930407 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9197b580-1cf6-4939-abfd-8dcac6a5df7e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.934419 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9197b580-1cf6-4939-abfd-8dcac6a5df7e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.940503 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9197b580-1cf6-4939-abfd-8dcac6a5df7e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.940984 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9197b580-1cf6-4939-abfd-8dcac6a5df7e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.941115 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr6v8\" (UniqueName: \"kubernetes.io/projected/9197b580-1cf6-4939-abfd-8dcac6a5df7e-kube-api-access-fr6v8\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.942658 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9197b580-1cf6-4939-abfd-8dcac6a5df7e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:36 crc kubenswrapper[4865]: I1205 06:16:36.960766 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9197b580-1cf6-4939-abfd-8dcac6a5df7e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:37 crc kubenswrapper[4865]: I1205 06:16:37.021548 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3" path="/var/lib/kubelet/pods/b655d6e9-628e-4fbb-a0c4-fc46a71b9ff3/volumes" Dec 05 06:16:37 crc kubenswrapper[4865]: I1205 06:16:37.022897 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff4eacf2-62b6-48a0-9650-77e19a6db904" path="/var/lib/kubelet/pods/ff4eacf2-62b6-48a0-9650-77e19a6db904/volumes" Dec 05 06:16:37 crc kubenswrapper[4865]: I1205 06:16:37.138035 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 06:16:37 crc kubenswrapper[4865]: I1205 06:16:37.257299 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:16:37 crc kubenswrapper[4865]: I1205 06:16:37.392046 4865 generic.go:334] "Generic (PLEG): container finished" podID="5b86488f-d09a-42b0-90f4-c9a188eab8bd" containerID="def1443869b7fddd6b37a7b9a6f084fd439131aed86e2d610e200e88396d1159" exitCode=0 Dec 05 06:16:37 crc kubenswrapper[4865]: I1205 06:16:37.392576 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-g65l8" event={"ID":"5b86488f-d09a-42b0-90f4-c9a188eab8bd","Type":"ContainerDied","Data":"def1443869b7fddd6b37a7b9a6f084fd439131aed86e2d610e200e88396d1159"} Dec 05 06:16:37 crc kubenswrapper[4865]: I1205 06:16:37.474862 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 06:16:37 crc kubenswrapper[4865]: I1205 06:16:37.784194 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 06:16:37 crc kubenswrapper[4865]: W1205 06:16:37.793191 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9197b580_1cf6_4939_abfd_8dcac6a5df7e.slice/crio-4b98b52f789e31dfd29c43565b44950ff8f36d8f253674857de8b8d0c69b1c32 WatchSource:0}: Error finding container 4b98b52f789e31dfd29c43565b44950ff8f36d8f253674857de8b8d0c69b1c32: Status 404 returned error can't find the container with id 4b98b52f789e31dfd29c43565b44950ff8f36d8f253674857de8b8d0c69b1c32 Dec 05 06:16:38 crc kubenswrapper[4865]: I1205 06:16:38.413727 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9197b580-1cf6-4939-abfd-8dcac6a5df7e","Type":"ContainerStarted","Data":"4b98b52f789e31dfd29c43565b44950ff8f36d8f253674857de8b8d0c69b1c32"} Dec 05 06:16:38 crc kubenswrapper[4865]: I1205 06:16:38.417361 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-g65l8" event={"ID":"5b86488f-d09a-42b0-90f4-c9a188eab8bd","Type":"ContainerStarted","Data":"9b0486c5f82e125cc19e6ecb6074234240d5d2cf4def780230f7528744a4203c"} Dec 05 06:16:38 crc kubenswrapper[4865]: I1205 06:16:38.418110 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:38 crc kubenswrapper[4865]: I1205 06:16:38.419044 4865 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d853a9c1-f9c9-412e-91bb-9f87123db63d","Type":"ContainerStarted","Data":"cb5b4a21a0b00d8957b250e9c32f6d1843fe3b53ee3af5bf9ac8e8a5bc603e14"} Dec 05 06:16:38 crc kubenswrapper[4865]: I1205 06:16:38.451807 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5576978c7c-g65l8" podStartSLOduration=3.451785236 podStartE2EDuration="3.451785236s" podCreationTimestamp="2025-12-05 06:16:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:16:38.448557807 +0000 UTC m=+1417.728569049" watchObservedRunningTime="2025-12-05 06:16:38.451785236 +0000 UTC m=+1417.731796458" Dec 05 06:16:40 crc kubenswrapper[4865]: I1205 06:16:40.445280 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9197b580-1cf6-4939-abfd-8dcac6a5df7e","Type":"ContainerStarted","Data":"4694ebef900337306b025902921e82d5679dd60eab897ae69af4758eb5c55b6e"} Dec 05 06:16:40 crc kubenswrapper[4865]: I1205 06:16:40.449981 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d853a9c1-f9c9-412e-91bb-9f87123db63d","Type":"ContainerStarted","Data":"feb3ee8cbd2ae4baa3f7bc4553e64315d76d7e90504ffc141f452c95919508b1"} Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.431205 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.511217 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-fl7gw"] Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.511455 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" podUID="36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f" containerName="dnsmasq-dns" containerID="cri-o://52ebbf0680e2ca6f273e96976bfe72f482b1adab037c62d4d0d998abf2e0f489" gracePeriod=10 Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.709424 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fb7f8d4c-hp4ck"] Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.710979 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.725192 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fb7f8d4c-hp4ck"] Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.813944 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-dns-svc\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.814225 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-ovsdbserver-nb\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.814340 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-ovsdbserver-sb\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.814514 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-dns-swift-storage-0\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.814650 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-openstack-edpm-ipam\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.814712 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-config\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.814741 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5s56\" (UniqueName: \"kubernetes.io/projected/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-kube-api-access-j5s56\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.916114 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-openstack-edpm-ipam\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 
06:16:45.916165 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-config\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.916186 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5s56\" (UniqueName: \"kubernetes.io/projected/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-kube-api-access-j5s56\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.916231 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-dns-svc\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.916264 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-ovsdbserver-nb\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.916303 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-ovsdbserver-sb\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.916350 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-dns-swift-storage-0\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.917160 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-dns-swift-storage-0\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.918175 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-dns-svc\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.918234 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-ovsdbserver-nb\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.918424 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-ovsdbserver-sb\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.918682 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-openstack-edpm-ipam\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.918985 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-config\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:45 crc kubenswrapper[4865]: I1205 06:16:45.940652 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5s56\" (UniqueName: \"kubernetes.io/projected/1e9a22c2-0e4d-4c25-b694-e3afc4721e58-kube-api-access-j5s56\") pod \"dnsmasq-dns-55fb7f8d4c-hp4ck\" (UID: \"1e9a22c2-0e4d-4c25-b694-e3afc4721e58\") " pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.040171 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.047416 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.120572 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-ovsdbserver-sb\") pod \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.120877 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj4vj\" (UniqueName: \"kubernetes.io/projected/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-kube-api-access-sj4vj\") pod \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.120967 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-config\") pod \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.121109 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-dns-svc\") pod \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.121137 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-dns-swift-storage-0\") pod \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " Dec 05 
06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.121264 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-ovsdbserver-nb\") pod \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\" (UID: \"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f\") " Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.189083 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-kube-api-access-sj4vj" (OuterVolumeSpecName: "kube-api-access-sj4vj") pod "36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f" (UID: "36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f"). InnerVolumeSpecName "kube-api-access-sj4vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.216814 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f" (UID: "36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.223059 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f" (UID: "36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.225362 4865 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.225384 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.225398 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj4vj\" (UniqueName: \"kubernetes.io/projected/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-kube-api-access-sj4vj\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.232300 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-config" (OuterVolumeSpecName: "config") pod "36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f" (UID: "36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.237522 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f" (UID: "36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.267844 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f" (UID: "36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.326918 4865 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.326947 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.326958 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.525065 4865 generic.go:334] "Generic (PLEG): container finished" podID="36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f" containerID="52ebbf0680e2ca6f273e96976bfe72f482b1adab037c62d4d0d998abf2e0f489" exitCode=0 Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.525102 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" event={"ID":"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f","Type":"ContainerDied","Data":"52ebbf0680e2ca6f273e96976bfe72f482b1adab037c62d4d0d998abf2e0f489"} Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.525129 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" event={"ID":"36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f","Type":"ContainerDied","Data":"ac0c5d344f2ca920ba6b4eb22b0ff85c49ccfbd5c640e6557ec4ae3744388dfc"} Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.525149 4865 scope.go:117] "RemoveContainer" containerID="52ebbf0680e2ca6f273e96976bfe72f482b1adab037c62d4d0d998abf2e0f489" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.525276 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-fl7gw" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.568545 4865 scope.go:117] "RemoveContainer" containerID="19f745284ac0e46ccac72ab5e30bce50daf9e7437d8128939d316e8f14376821" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.601123 4865 scope.go:117] "RemoveContainer" containerID="52ebbf0680e2ca6f273e96976bfe72f482b1adab037c62d4d0d998abf2e0f489" Dec 05 06:16:46 crc kubenswrapper[4865]: E1205 06:16:46.601941 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52ebbf0680e2ca6f273e96976bfe72f482b1adab037c62d4d0d998abf2e0f489\": container with ID starting with 52ebbf0680e2ca6f273e96976bfe72f482b1adab037c62d4d0d998abf2e0f489 not found: ID does not exist" containerID="52ebbf0680e2ca6f273e96976bfe72f482b1adab037c62d4d0d998abf2e0f489" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.602099 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52ebbf0680e2ca6f273e96976bfe72f482b1adab037c62d4d0d998abf2e0f489"} err="failed to get container status \"52ebbf0680e2ca6f273e96976bfe72f482b1adab037c62d4d0d998abf2e0f489\": rpc error: code = NotFound desc = could not find container \"52ebbf0680e2ca6f273e96976bfe72f482b1adab037c62d4d0d998abf2e0f489\": container with ID starting with 52ebbf0680e2ca6f273e96976bfe72f482b1adab037c62d4d0d998abf2e0f489 not found: ID does not exist" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.602196 4865 scope.go:117] "RemoveContainer" containerID="19f745284ac0e46ccac72ab5e30bce50daf9e7437d8128939d316e8f14376821" Dec 05 06:16:46 crc kubenswrapper[4865]: E1205 06:16:46.602984 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19f745284ac0e46ccac72ab5e30bce50daf9e7437d8128939d316e8f14376821\": container with ID starting with 19f745284ac0e46ccac72ab5e30bce50daf9e7437d8128939d316e8f14376821 not found: ID does not exist" containerID="19f745284ac0e46ccac72ab5e30bce50daf9e7437d8128939d316e8f14376821" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.603076 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19f745284ac0e46ccac72ab5e30bce50daf9e7437d8128939d316e8f14376821"} err="failed to get container status \"19f745284ac0e46ccac72ab5e30bce50daf9e7437d8128939d316e8f14376821\": rpc error: code = NotFound desc = could not find container \"19f745284ac0e46ccac72ab5e30bce50daf9e7437d8128939d316e8f14376821\": container with ID starting with 19f745284ac0e46ccac72ab5e30bce50daf9e7437d8128939d316e8f14376821 not found: ID does not exist" Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.612283 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fb7f8d4c-hp4ck"] Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.624098 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-fl7gw"] Dec 05 06:16:46 crc kubenswrapper[4865]: I1205 06:16:46.640108 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-fl7gw"] Dec 05 06:16:47 crc kubenswrapper[4865]: I1205 06:16:47.018630 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f" path="/var/lib/kubelet/pods/36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f/volumes" Dec 05 06:16:47 crc kubenswrapper[4865]: I1205 06:16:47.538601 4865 generic.go:334] 
"Generic (PLEG): container finished" podID="1e9a22c2-0e4d-4c25-b694-e3afc4721e58" containerID="52990583299ac9160fd1f9c49df88985e1a0120ce05d51be5c7bf07c288b5ff3" exitCode=0 Dec 05 06:16:47 crc kubenswrapper[4865]: I1205 06:16:47.538678 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" event={"ID":"1e9a22c2-0e4d-4c25-b694-e3afc4721e58","Type":"ContainerDied","Data":"52990583299ac9160fd1f9c49df88985e1a0120ce05d51be5c7bf07c288b5ff3"} Dec 05 06:16:47 crc kubenswrapper[4865]: I1205 06:16:47.539007 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" event={"ID":"1e9a22c2-0e4d-4c25-b694-e3afc4721e58","Type":"ContainerStarted","Data":"17fffd9baaa500ad876cf89a10efa38b97c05b200511e01b10d38ac85b70d183"} Dec 05 06:16:48 crc kubenswrapper[4865]: I1205 06:16:48.559140 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" event={"ID":"1e9a22c2-0e4d-4c25-b694-e3afc4721e58","Type":"ContainerStarted","Data":"0c6381c75eeed736a66b8659926aa3c2a244868c3b4160e9fedaed325a12b9bf"} Dec 05 06:16:48 crc kubenswrapper[4865]: I1205 06:16:48.559698 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:48 crc kubenswrapper[4865]: I1205 06:16:48.597513 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" podStartSLOduration=3.597479904 podStartE2EDuration="3.597479904s" podCreationTimestamp="2025-12-05 06:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:16:48.591747506 +0000 UTC m=+1427.871758758" watchObservedRunningTime="2025-12-05 06:16:48.597479904 +0000 UTC m=+1427.877491156" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.042013 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55fb7f8d4c-hp4ck" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.152404 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-g65l8"] Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.152656 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5576978c7c-g65l8" podUID="5b86488f-d09a-42b0-90f4-c9a188eab8bd" containerName="dnsmasq-dns" containerID="cri-o://9b0486c5f82e125cc19e6ecb6074234240d5d2cf4def780230f7528744a4203c" gracePeriod=10 Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.630655 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.651602 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-g65l8" event={"ID":"5b86488f-d09a-42b0-90f4-c9a188eab8bd","Type":"ContainerDied","Data":"9b0486c5f82e125cc19e6ecb6074234240d5d2cf4def780230f7528744a4203c"} Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.651649 4865 scope.go:117] "RemoveContainer" containerID="9b0486c5f82e125cc19e6ecb6074234240d5d2cf4def780230f7528744a4203c" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.651611 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-g65l8" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.651559 4865 generic.go:334] "Generic (PLEG): container finished" podID="5b86488f-d09a-42b0-90f4-c9a188eab8bd" containerID="9b0486c5f82e125cc19e6ecb6074234240d5d2cf4def780230f7528744a4203c" exitCode=0 Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.651849 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-g65l8" event={"ID":"5b86488f-d09a-42b0-90f4-c9a188eab8bd","Type":"ContainerDied","Data":"4cfbb5852d828a894fd0f487cf245fb46f4e1da289c6dc1d5e7b5f23e57087e3"} Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.702578 4865 scope.go:117] "RemoveContainer" containerID="def1443869b7fddd6b37a7b9a6f084fd439131aed86e2d610e200e88396d1159" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.729472 4865 scope.go:117] "RemoveContainer" containerID="9b0486c5f82e125cc19e6ecb6074234240d5d2cf4def780230f7528744a4203c" Dec 05 06:16:56 crc kubenswrapper[4865]: E1205 06:16:56.729960 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b0486c5f82e125cc19e6ecb6074234240d5d2cf4def780230f7528744a4203c\": container with ID starting with 9b0486c5f82e125cc19e6ecb6074234240d5d2cf4def780230f7528744a4203c not found: ID does not exist" containerID="9b0486c5f82e125cc19e6ecb6074234240d5d2cf4def780230f7528744a4203c" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.730012 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b0486c5f82e125cc19e6ecb6074234240d5d2cf4def780230f7528744a4203c"} err="failed to get container status \"9b0486c5f82e125cc19e6ecb6074234240d5d2cf4def780230f7528744a4203c\": rpc error: code = NotFound desc = could not find container \"9b0486c5f82e125cc19e6ecb6074234240d5d2cf4def780230f7528744a4203c\": container with ID starting with 9b0486c5f82e125cc19e6ecb6074234240d5d2cf4def780230f7528744a4203c not found: ID does not exist" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.730045 4865 scope.go:117] "RemoveContainer" containerID="def1443869b7fddd6b37a7b9a6f084fd439131aed86e2d610e200e88396d1159" Dec 05 06:16:56 crc kubenswrapper[4865]: E1205 06:16:56.730489 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"def1443869b7fddd6b37a7b9a6f084fd439131aed86e2d610e200e88396d1159\": container with ID starting with def1443869b7fddd6b37a7b9a6f084fd439131aed86e2d610e200e88396d1159 not found: ID does not exist" containerID="def1443869b7fddd6b37a7b9a6f084fd439131aed86e2d610e200e88396d1159" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.730508 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"def1443869b7fddd6b37a7b9a6f084fd439131aed86e2d610e200e88396d1159"} err="failed to get container status \"def1443869b7fddd6b37a7b9a6f084fd439131aed86e2d610e200e88396d1159\": rpc error: code = NotFound desc = could not find container \"def1443869b7fddd6b37a7b9a6f084fd439131aed86e2d610e200e88396d1159\": container with ID starting with def1443869b7fddd6b37a7b9a6f084fd439131aed86e2d610e200e88396d1159 not found: ID does not exist" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.788769 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7j2q\" (UniqueName: 
\"kubernetes.io/projected/5b86488f-d09a-42b0-90f4-c9a188eab8bd-kube-api-access-g7j2q\") pod \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.788950 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-ovsdbserver-nb\") pod \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.788998 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-config\") pod \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.789064 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-dns-swift-storage-0\") pod \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.789098 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-dns-svc\") pod \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.789171 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-openstack-edpm-ipam\") pod \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.789199 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-ovsdbserver-sb\") pod \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\" (UID: \"5b86488f-d09a-42b0-90f4-c9a188eab8bd\") " Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.795387 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b86488f-d09a-42b0-90f4-c9a188eab8bd-kube-api-access-g7j2q" (OuterVolumeSpecName: "kube-api-access-g7j2q") pod "5b86488f-d09a-42b0-90f4-c9a188eab8bd" (UID: "5b86488f-d09a-42b0-90f4-c9a188eab8bd"). InnerVolumeSpecName "kube-api-access-g7j2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.852667 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "5b86488f-d09a-42b0-90f4-c9a188eab8bd" (UID: "5b86488f-d09a-42b0-90f4-c9a188eab8bd"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.856288 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-config" (OuterVolumeSpecName: "config") pod "5b86488f-d09a-42b0-90f4-c9a188eab8bd" (UID: "5b86488f-d09a-42b0-90f4-c9a188eab8bd"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.876069 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5b86488f-d09a-42b0-90f4-c9a188eab8bd" (UID: "5b86488f-d09a-42b0-90f4-c9a188eab8bd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.880403 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5b86488f-d09a-42b0-90f4-c9a188eab8bd" (UID: "5b86488f-d09a-42b0-90f4-c9a188eab8bd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.890789 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5b86488f-d09a-42b0-90f4-c9a188eab8bd" (UID: "5b86488f-d09a-42b0-90f4-c9a188eab8bd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.891896 4865 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.891921 4865 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.891935 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7j2q\" (UniqueName: \"kubernetes.io/projected/5b86488f-d09a-42b0-90f4-c9a188eab8bd-kube-api-access-g7j2q\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.891945 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.891954 4865 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-config\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.891962 4865 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.905369 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5b86488f-d09a-42b0-90f4-c9a188eab8bd" (UID: "5b86488f-d09a-42b0-90f4-c9a188eab8bd"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.983007 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-g65l8"] Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.993157 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-g65l8"] Dec 05 06:16:56 crc kubenswrapper[4865]: I1205 06:16:56.993717 4865 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b86488f-d09a-42b0-90f4-c9a188eab8bd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 06:16:57 crc kubenswrapper[4865]: I1205 06:16:57.016601 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b86488f-d09a-42b0-90f4-c9a188eab8bd" path="/var/lib/kubelet/pods/5b86488f-d09a-42b0-90f4-c9a188eab8bd/volumes" Dec 05 06:17:12 crc kubenswrapper[4865]: I1205 06:17:12.829568 4865 generic.go:334] "Generic (PLEG): container finished" podID="9197b580-1cf6-4939-abfd-8dcac6a5df7e" containerID="4694ebef900337306b025902921e82d5679dd60eab897ae69af4758eb5c55b6e" exitCode=0 Dec 05 06:17:12 crc kubenswrapper[4865]: I1205 06:17:12.829685 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9197b580-1cf6-4939-abfd-8dcac6a5df7e","Type":"ContainerDied","Data":"4694ebef900337306b025902921e82d5679dd60eab897ae69af4758eb5c55b6e"} Dec 05 06:17:12 crc kubenswrapper[4865]: I1205 06:17:12.835800 4865 generic.go:334] "Generic (PLEG): container finished" podID="d853a9c1-f9c9-412e-91bb-9f87123db63d" containerID="feb3ee8cbd2ae4baa3f7bc4553e64315d76d7e90504ffc141f452c95919508b1" exitCode=0 Dec 05 06:17:12 crc kubenswrapper[4865]: I1205 06:17:12.835867 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d853a9c1-f9c9-412e-91bb-9f87123db63d","Type":"ContainerDied","Data":"feb3ee8cbd2ae4baa3f7bc4553e64315d76d7e90504ffc141f452c95919508b1"} Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.722059 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6h76x"] Dec 05 06:17:13 crc kubenswrapper[4865]: E1205 06:17:13.722853 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f" containerName="dnsmasq-dns" Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.722867 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f" containerName="dnsmasq-dns" Dec 05 06:17:13 crc kubenswrapper[4865]: E1205 06:17:13.722898 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f" containerName="init" Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.722906 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f" containerName="init" Dec 05 06:17:13 crc kubenswrapper[4865]: E1205 06:17:13.722917 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b86488f-d09a-42b0-90f4-c9a188eab8bd" containerName="init" Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.722926 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b86488f-d09a-42b0-90f4-c9a188eab8bd" containerName="init" Dec 05 06:17:13 crc kubenswrapper[4865]: E1205 06:17:13.722943 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b86488f-d09a-42b0-90f4-c9a188eab8bd" containerName="dnsmasq-dns" Dec 05 06:17:13 crc 
kubenswrapper[4865]: I1205 06:17:13.722949 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b86488f-d09a-42b0-90f4-c9a188eab8bd" containerName="dnsmasq-dns" Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.723129 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="36008d4f-e5dd-4cb6-9dc3-cd577bc48a5f" containerName="dnsmasq-dns" Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.723149 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b86488f-d09a-42b0-90f4-c9a188eab8bd" containerName="dnsmasq-dns" Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.724625 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6h76x" Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.739904 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6h76x"] Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.743501 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250a94d2-9568-419a-8b44-8ec1ab965eca-utilities\") pod \"certified-operators-6h76x\" (UID: \"250a94d2-9568-419a-8b44-8ec1ab965eca\") " pod="openshift-marketplace/certified-operators-6h76x" Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.743561 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250a94d2-9568-419a-8b44-8ec1ab965eca-catalog-content\") pod \"certified-operators-6h76x\" (UID: \"250a94d2-9568-419a-8b44-8ec1ab965eca\") " pod="openshift-marketplace/certified-operators-6h76x" Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.743605 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s5wn\" (UniqueName: \"kubernetes.io/projected/250a94d2-9568-419a-8b44-8ec1ab965eca-kube-api-access-6s5wn\") pod \"certified-operators-6h76x\" (UID: \"250a94d2-9568-419a-8b44-8ec1ab965eca\") " pod="openshift-marketplace/certified-operators-6h76x" Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.847724 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250a94d2-9568-419a-8b44-8ec1ab965eca-utilities\") pod \"certified-operators-6h76x\" (UID: \"250a94d2-9568-419a-8b44-8ec1ab965eca\") " pod="openshift-marketplace/certified-operators-6h76x" Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.847793 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250a94d2-9568-419a-8b44-8ec1ab965eca-catalog-content\") pod \"certified-operators-6h76x\" (UID: \"250a94d2-9568-419a-8b44-8ec1ab965eca\") " pod="openshift-marketplace/certified-operators-6h76x" Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.847863 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s5wn\" (UniqueName: \"kubernetes.io/projected/250a94d2-9568-419a-8b44-8ec1ab965eca-kube-api-access-6s5wn\") pod \"certified-operators-6h76x\" (UID: \"250a94d2-9568-419a-8b44-8ec1ab965eca\") " pod="openshift-marketplace/certified-operators-6h76x" Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.848589 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/250a94d2-9568-419a-8b44-8ec1ab965eca-catalog-content\") pod \"certified-operators-6h76x\" (UID: \"250a94d2-9568-419a-8b44-8ec1ab965eca\") " pod="openshift-marketplace/certified-operators-6h76x" Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.848729 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250a94d2-9568-419a-8b44-8ec1ab965eca-utilities\") pod \"certified-operators-6h76x\" (UID: \"250a94d2-9568-419a-8b44-8ec1ab965eca\") " pod="openshift-marketplace/certified-operators-6h76x" Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.855349 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9197b580-1cf6-4939-abfd-8dcac6a5df7e","Type":"ContainerStarted","Data":"be91007e39897f53ed1736bf8e95c286953316f0508660e696ecfd785e08708b"} Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.857396 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.859368 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d853a9c1-f9c9-412e-91bb-9f87123db63d","Type":"ContainerStarted","Data":"6c6b073d248f060bf454e1bbb45dddbb8fd160d0b7eefd48e4702cde80c4c934"} Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.859786 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.870566 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s5wn\" (UniqueName: \"kubernetes.io/projected/250a94d2-9568-419a-8b44-8ec1ab965eca-kube-api-access-6s5wn\") pod \"certified-operators-6h76x\" (UID: \"250a94d2-9568-419a-8b44-8ec1ab965eca\") " pod="openshift-marketplace/certified-operators-6h76x" Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.892895 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.892874277 podStartE2EDuration="37.892874277s" podCreationTimestamp="2025-12-05 06:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:17:13.886738582 +0000 UTC m=+1453.166749824" watchObservedRunningTime="2025-12-05 06:17:13.892874277 +0000 UTC m=+1453.172885499" Dec 05 06:17:13 crc kubenswrapper[4865]: I1205 06:17:13.921342 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.921320803 podStartE2EDuration="37.921320803s" podCreationTimestamp="2025-12-05 06:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 06:17:13.917407838 +0000 UTC m=+1453.197419060" watchObservedRunningTime="2025-12-05 06:17:13.921320803 +0000 UTC m=+1453.201332025" Dec 05 06:17:14 crc kubenswrapper[4865]: I1205 06:17:14.096509 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6h76x" Dec 05 06:17:14 crc kubenswrapper[4865]: I1205 06:17:14.738920 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6h76x"] Dec 05 06:17:14 crc kubenswrapper[4865]: I1205 06:17:14.894943 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6h76x" event={"ID":"250a94d2-9568-419a-8b44-8ec1ab965eca","Type":"ContainerStarted","Data":"82d48891f5b620148133b93a088b259e6f6836d1e72d5054298f1ad9a01019fe"} Dec 05 06:17:15 crc kubenswrapper[4865]: I1205 06:17:15.904160 4865 generic.go:334] "Generic (PLEG): container finished" podID="250a94d2-9568-419a-8b44-8ec1ab965eca" containerID="dce67d3c079f998dcabbd71971779c94df0e3c38ade7c895efaa421ee91065c9" exitCode=0 Dec 05 06:17:15 crc kubenswrapper[4865]: I1205 06:17:15.904443 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6h76x" event={"ID":"250a94d2-9568-419a-8b44-8ec1ab965eca","Type":"ContainerDied","Data":"dce67d3c079f998dcabbd71971779c94df0e3c38ade7c895efaa421ee91065c9"} Dec 05 06:17:16 crc kubenswrapper[4865]: I1205 06:17:16.924071 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6h76x" event={"ID":"250a94d2-9568-419a-8b44-8ec1ab965eca","Type":"ContainerStarted","Data":"a912cec2a22492cd952ee2e2b2e213a7e5d45f578752ae2b10f06d8ef4876d3d"} Dec 05 06:17:17 crc kubenswrapper[4865]: I1205 06:17:17.534210 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj"] Dec 05 06:17:17 crc kubenswrapper[4865]: I1205 06:17:17.535665 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" Dec 05 06:17:17 crc kubenswrapper[4865]: W1205 06:17:17.538715 4865 reflector.go:561] object-"openstack"/"openstack-edpm-ipam-dockercfg-gtc4b": failed to list *v1.Secret: secrets "openstack-edpm-ipam-dockercfg-gtc4b" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 05 06:17:17 crc kubenswrapper[4865]: E1205 06:17:17.538755 4865 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"openstack-edpm-ipam-dockercfg-gtc4b\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openstack-edpm-ipam-dockercfg-gtc4b\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 06:17:17 crc kubenswrapper[4865]: I1205 06:17:17.538834 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 06:17:17 crc kubenswrapper[4865]: I1205 06:17:17.538909 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 06:17:17 crc kubenswrapper[4865]: I1205 06:17:17.543874 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 06:17:17 crc kubenswrapper[4865]: I1205 06:17:17.562916 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj"] Dec 05 06:17:17 crc kubenswrapper[4865]: I1205 06:17:17.642733 4865 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj\" (UID: \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" Dec 05 06:17:17 crc kubenswrapper[4865]: I1205 06:17:17.642847 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj\" (UID: \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" Dec 05 06:17:17 crc kubenswrapper[4865]: I1205 06:17:17.642908 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc24m\" (UniqueName: \"kubernetes.io/projected/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-kube-api-access-wc24m\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj\" (UID: \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" Dec 05 06:17:17 crc kubenswrapper[4865]: I1205 06:17:17.643023 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj\" (UID: \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" Dec 05 06:17:17 crc kubenswrapper[4865]: I1205 06:17:17.745299 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj\" (UID: \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" Dec 05 06:17:17 crc kubenswrapper[4865]: I1205 06:17:17.745445 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj\" (UID: \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" Dec 05 06:17:17 crc kubenswrapper[4865]: I1205 06:17:17.745506 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc24m\" (UniqueName: \"kubernetes.io/projected/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-kube-api-access-wc24m\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj\" (UID: \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" Dec 05 06:17:17 crc kubenswrapper[4865]: I1205 06:17:17.745586 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj\" (UID: \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" Dec 05 06:17:17 crc 
kubenswrapper[4865]: I1205 06:17:17.755381 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj\" (UID: \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" Dec 05 06:17:17 crc kubenswrapper[4865]: I1205 06:17:17.755805 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj\" (UID: \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" Dec 05 06:17:17 crc kubenswrapper[4865]: I1205 06:17:17.766098 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc24m\" (UniqueName: \"kubernetes.io/projected/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-kube-api-access-wc24m\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj\" (UID: \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" Dec 05 06:17:17 crc kubenswrapper[4865]: I1205 06:17:17.772387 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj\" (UID: \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" Dec 05 06:17:18 crc kubenswrapper[4865]: I1205 06:17:18.768358 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gtc4b" Dec 05 06:17:18 crc kubenswrapper[4865]: I1205 06:17:18.774234 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" Dec 05 06:17:18 crc kubenswrapper[4865]: I1205 06:17:18.987174 4865 generic.go:334] "Generic (PLEG): container finished" podID="250a94d2-9568-419a-8b44-8ec1ab965eca" containerID="a912cec2a22492cd952ee2e2b2e213a7e5d45f578752ae2b10f06d8ef4876d3d" exitCode=0 Dec 05 06:17:18 crc kubenswrapper[4865]: I1205 06:17:18.987248 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6h76x" event={"ID":"250a94d2-9568-419a-8b44-8ec1ab965eca","Type":"ContainerDied","Data":"a912cec2a22492cd952ee2e2b2e213a7e5d45f578752ae2b10f06d8ef4876d3d"} Dec 05 06:17:19 crc kubenswrapper[4865]: I1205 06:17:19.798524 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj"] Dec 05 06:17:20 crc kubenswrapper[4865]: I1205 06:17:20.059689 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6h76x" event={"ID":"250a94d2-9568-419a-8b44-8ec1ab965eca","Type":"ContainerStarted","Data":"9ed11a0dcf4bdf9ac34ae3e43b95f52af099cec1ebe0c67f6207dce150ecaacf"} Dec 05 06:17:20 crc kubenswrapper[4865]: I1205 06:17:20.065204 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" event={"ID":"644fb5cf-0fad-4825-9975-46e8c5f3e1ec","Type":"ContainerStarted","Data":"6ecccfb4b3a6c5980967adbd341ec0f92a8a66f924dc3e48e1a22245a33a43cb"} Dec 05 06:17:20 crc kubenswrapper[4865]: I1205 06:17:20.087781 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6h76x" podStartSLOduration=3.556378941 podStartE2EDuration="7.087760396s" podCreationTimestamp="2025-12-05 06:17:13 +0000 UTC" firstStartedPulling="2025-12-05 06:17:15.905616367 +0000 UTC m=+1455.185627599" lastFinishedPulling="2025-12-05 06:17:19.436997832 +0000 UTC m=+1458.717009054" observedRunningTime="2025-12-05 06:17:20.085811173 +0000 UTC m=+1459.365822395" watchObservedRunningTime="2025-12-05 06:17:20.087760396 +0000 UTC m=+1459.367771618" Dec 05 06:17:24 crc kubenswrapper[4865]: I1205 06:17:24.097383 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6h76x" Dec 05 06:17:24 crc kubenswrapper[4865]: I1205 06:17:24.097934 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6h76x" Dec 05 06:17:25 crc kubenswrapper[4865]: I1205 06:17:25.167093 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-6h76x" podUID="250a94d2-9568-419a-8b44-8ec1ab965eca" containerName="registry-server" probeResult="failure" output=< Dec 05 06:17:25 crc kubenswrapper[4865]: timeout: failed to connect service ":50051" within 1s Dec 05 06:17:25 crc kubenswrapper[4865]: > Dec 05 06:17:27 crc kubenswrapper[4865]: I1205 06:17:27.142027 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 05 06:17:27 crc kubenswrapper[4865]: I1205 06:17:27.265187 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 06:17:30 crc kubenswrapper[4865]: I1205 06:17:30.628935 4865 scope.go:117] "RemoveContainer" containerID="49d67739f31cafcc87fc6c330cb79fbdef086b770232befb583085154eba0839" Dec 05 06:17:34 crc kubenswrapper[4865]: I1205 06:17:34.164010 4865 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6h76x" Dec 05 06:17:34 crc kubenswrapper[4865]: I1205 06:17:34.239010 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6h76x" Dec 05 06:17:34 crc kubenswrapper[4865]: I1205 06:17:34.400201 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6h76x"] Dec 05 06:17:35 crc kubenswrapper[4865]: I1205 06:17:35.243298 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6h76x" podUID="250a94d2-9568-419a-8b44-8ec1ab965eca" containerName="registry-server" containerID="cri-o://9ed11a0dcf4bdf9ac34ae3e43b95f52af099cec1ebe0c67f6207dce150ecaacf" gracePeriod=2 Dec 05 06:17:35 crc kubenswrapper[4865]: E1205 06:17:35.489958 4865 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod250a94d2_9568_419a_8b44_8ec1ab965eca.slice/crio-9ed11a0dcf4bdf9ac34ae3e43b95f52af099cec1ebe0c67f6207dce150ecaacf.scope\": RecentStats: unable to find data in memory cache]" Dec 05 06:17:36 crc kubenswrapper[4865]: I1205 06:17:36.219804 4865 scope.go:117] "RemoveContainer" containerID="e4dc6db7c977e358c17d555fa585da5768e5c4183459af9eccb20e6a88adfaf9" Dec 05 06:17:36 crc kubenswrapper[4865]: I1205 06:17:36.262179 4865 generic.go:334] "Generic (PLEG): container finished" podID="250a94d2-9568-419a-8b44-8ec1ab965eca" containerID="9ed11a0dcf4bdf9ac34ae3e43b95f52af099cec1ebe0c67f6207dce150ecaacf" exitCode=0 Dec 05 06:17:36 crc kubenswrapper[4865]: I1205 06:17:36.262222 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6h76x" event={"ID":"250a94d2-9568-419a-8b44-8ec1ab965eca","Type":"ContainerDied","Data":"9ed11a0dcf4bdf9ac34ae3e43b95f52af099cec1ebe0c67f6207dce150ecaacf"} Dec 05 06:17:36 crc kubenswrapper[4865]: I1205 06:17:36.310495 4865 scope.go:117] "RemoveContainer" containerID="d8a5c42fa71fcb839cb32be74a4860d9c055f794d10b51607b3f1c0c43ce20d6" Dec 05 06:17:36 crc kubenswrapper[4865]: I1205 06:17:36.628412 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6h76x" Dec 05 06:17:36 crc kubenswrapper[4865]: I1205 06:17:36.762398 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250a94d2-9568-419a-8b44-8ec1ab965eca-utilities\") pod \"250a94d2-9568-419a-8b44-8ec1ab965eca\" (UID: \"250a94d2-9568-419a-8b44-8ec1ab965eca\") " Dec 05 06:17:36 crc kubenswrapper[4865]: I1205 06:17:36.762615 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s5wn\" (UniqueName: \"kubernetes.io/projected/250a94d2-9568-419a-8b44-8ec1ab965eca-kube-api-access-6s5wn\") pod \"250a94d2-9568-419a-8b44-8ec1ab965eca\" (UID: \"250a94d2-9568-419a-8b44-8ec1ab965eca\") " Dec 05 06:17:36 crc kubenswrapper[4865]: I1205 06:17:36.762758 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250a94d2-9568-419a-8b44-8ec1ab965eca-catalog-content\") pod \"250a94d2-9568-419a-8b44-8ec1ab965eca\" (UID: \"250a94d2-9568-419a-8b44-8ec1ab965eca\") " Dec 05 06:17:36 crc kubenswrapper[4865]: I1205 06:17:36.763581 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/250a94d2-9568-419a-8b44-8ec1ab965eca-utilities" (OuterVolumeSpecName: "utilities") pod "250a94d2-9568-419a-8b44-8ec1ab965eca" (UID: "250a94d2-9568-419a-8b44-8ec1ab965eca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:17:36 crc kubenswrapper[4865]: I1205 06:17:36.780754 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/250a94d2-9568-419a-8b44-8ec1ab965eca-kube-api-access-6s5wn" (OuterVolumeSpecName: "kube-api-access-6s5wn") pod "250a94d2-9568-419a-8b44-8ec1ab965eca" (UID: "250a94d2-9568-419a-8b44-8ec1ab965eca"). InnerVolumeSpecName "kube-api-access-6s5wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:17:36 crc kubenswrapper[4865]: I1205 06:17:36.851444 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/250a94d2-9568-419a-8b44-8ec1ab965eca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "250a94d2-9568-419a-8b44-8ec1ab965eca" (UID: "250a94d2-9568-419a-8b44-8ec1ab965eca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:17:36 crc kubenswrapper[4865]: I1205 06:17:36.865324 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s5wn\" (UniqueName: \"kubernetes.io/projected/250a94d2-9568-419a-8b44-8ec1ab965eca-kube-api-access-6s5wn\") on node \"crc\" DevicePath \"\"" Dec 05 06:17:36 crc kubenswrapper[4865]: I1205 06:17:36.865910 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250a94d2-9568-419a-8b44-8ec1ab965eca-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:17:36 crc kubenswrapper[4865]: I1205 06:17:36.865926 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250a94d2-9568-419a-8b44-8ec1ab965eca-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:17:37 crc kubenswrapper[4865]: I1205 06:17:37.276756 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" event={"ID":"644fb5cf-0fad-4825-9975-46e8c5f3e1ec","Type":"ContainerStarted","Data":"39ef0fba9bfdcafea0361fb1189c2bb1d3948c847d105cc1b2b729922f238f5c"} Dec 05 06:17:37 crc kubenswrapper[4865]: I1205 06:17:37.284631 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6h76x" event={"ID":"250a94d2-9568-419a-8b44-8ec1ab965eca","Type":"ContainerDied","Data":"82d48891f5b620148133b93a088b259e6f6836d1e72d5054298f1ad9a01019fe"} Dec 05 06:17:37 crc kubenswrapper[4865]: I1205 06:17:37.284891 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6h76x" Dec 05 06:17:37 crc kubenswrapper[4865]: I1205 06:17:37.284949 4865 scope.go:117] "RemoveContainer" containerID="9ed11a0dcf4bdf9ac34ae3e43b95f52af099cec1ebe0c67f6207dce150ecaacf" Dec 05 06:17:37 crc kubenswrapper[4865]: I1205 06:17:37.300855 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" podStartSLOduration=3.798532576 podStartE2EDuration="20.300812905s" podCreationTimestamp="2025-12-05 06:17:17 +0000 UTC" firstStartedPulling="2025-12-05 06:17:19.809069131 +0000 UTC m=+1459.089080353" lastFinishedPulling="2025-12-05 06:17:36.31134946 +0000 UTC m=+1475.591360682" observedRunningTime="2025-12-05 06:17:37.300393344 +0000 UTC m=+1476.580404566" watchObservedRunningTime="2025-12-05 06:17:37.300812905 +0000 UTC m=+1476.580824127" Dec 05 06:17:37 crc kubenswrapper[4865]: I1205 06:17:37.327512 4865 scope.go:117] "RemoveContainer" containerID="a912cec2a22492cd952ee2e2b2e213a7e5d45f578752ae2b10f06d8ef4876d3d" Dec 05 06:17:37 crc kubenswrapper[4865]: I1205 06:17:37.355156 4865 scope.go:117] "RemoveContainer" containerID="dce67d3c079f998dcabbd71971779c94df0e3c38ade7c895efaa421ee91065c9" Dec 05 06:17:37 crc kubenswrapper[4865]: I1205 06:17:37.362570 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6h76x"] Dec 05 06:17:37 crc kubenswrapper[4865]: I1205 06:17:37.372154 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6h76x"] Dec 05 06:17:39 crc kubenswrapper[4865]: I1205 06:17:39.020689 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="250a94d2-9568-419a-8b44-8ec1ab965eca" path="/var/lib/kubelet/pods/250a94d2-9568-419a-8b44-8ec1ab965eca/volumes" Dec 05 06:17:41 crc kubenswrapper[4865]: I1205 
06:17:41.049099 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:17:41 crc kubenswrapper[4865]: I1205 06:17:41.049388 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:17:54 crc kubenswrapper[4865]: I1205 06:17:54.459766 4865 generic.go:334] "Generic (PLEG): container finished" podID="644fb5cf-0fad-4825-9975-46e8c5f3e1ec" containerID="39ef0fba9bfdcafea0361fb1189c2bb1d3948c847d105cc1b2b729922f238f5c" exitCode=0 Dec 05 06:17:54 crc kubenswrapper[4865]: I1205 06:17:54.459877 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" event={"ID":"644fb5cf-0fad-4825-9975-46e8c5f3e1ec","Type":"ContainerDied","Data":"39ef0fba9bfdcafea0361fb1189c2bb1d3948c847d105cc1b2b729922f238f5c"} Dec 05 06:17:55 crc kubenswrapper[4865]: I1205 06:17:55.946344 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.075580 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-ssh-key\") pod \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\" (UID: \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\") " Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.076046 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-inventory\") pod \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\" (UID: \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\") " Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.076428 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-repo-setup-combined-ca-bundle\") pod \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\" (UID: \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\") " Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.076527 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc24m\" (UniqueName: \"kubernetes.io/projected/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-kube-api-access-wc24m\") pod \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\" (UID: \"644fb5cf-0fad-4825-9975-46e8c5f3e1ec\") " Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.085759 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-kube-api-access-wc24m" (OuterVolumeSpecName: "kube-api-access-wc24m") pod "644fb5cf-0fad-4825-9975-46e8c5f3e1ec" (UID: "644fb5cf-0fad-4825-9975-46e8c5f3e1ec"). InnerVolumeSpecName "kube-api-access-wc24m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.098162 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "644fb5cf-0fad-4825-9975-46e8c5f3e1ec" (UID: "644fb5cf-0fad-4825-9975-46e8c5f3e1ec"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.109645 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-inventory" (OuterVolumeSpecName: "inventory") pod "644fb5cf-0fad-4825-9975-46e8c5f3e1ec" (UID: "644fb5cf-0fad-4825-9975-46e8c5f3e1ec"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.118032 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "644fb5cf-0fad-4825-9975-46e8c5f3e1ec" (UID: "644fb5cf-0fad-4825-9975-46e8c5f3e1ec"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.179454 4865 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.179516 4865 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.179528 4865 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.179540 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc24m\" (UniqueName: \"kubernetes.io/projected/644fb5cf-0fad-4825-9975-46e8c5f3e1ec-kube-api-access-wc24m\") on node \"crc\" DevicePath \"\"" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.486042 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" event={"ID":"644fb5cf-0fad-4825-9975-46e8c5f3e1ec","Type":"ContainerDied","Data":"6ecccfb4b3a6c5980967adbd341ec0f92a8a66f924dc3e48e1a22245a33a43cb"} Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.486098 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ecccfb4b3a6c5980967adbd341ec0f92a8a66f924dc3e48e1a22245a33a43cb" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.486173 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.620056 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm"] Dec 05 06:17:56 crc kubenswrapper[4865]: E1205 06:17:56.622427 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250a94d2-9568-419a-8b44-8ec1ab965eca" containerName="extract-content" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.622909 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="250a94d2-9568-419a-8b44-8ec1ab965eca" containerName="extract-content" Dec 05 06:17:56 crc kubenswrapper[4865]: E1205 06:17:56.629965 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="644fb5cf-0fad-4825-9975-46e8c5f3e1ec" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.630085 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="644fb5cf-0fad-4825-9975-46e8c5f3e1ec" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 06:17:56 crc kubenswrapper[4865]: E1205 06:17:56.630135 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250a94d2-9568-419a-8b44-8ec1ab965eca" containerName="registry-server" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.630299 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="250a94d2-9568-419a-8b44-8ec1ab965eca" containerName="registry-server" Dec 05 06:17:56 crc kubenswrapper[4865]: E1205 06:17:56.630362 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250a94d2-9568-419a-8b44-8ec1ab965eca" containerName="extract-utilities" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.630407 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="250a94d2-9568-419a-8b44-8ec1ab965eca" containerName="extract-utilities" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.630748 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="250a94d2-9568-419a-8b44-8ec1ab965eca" containerName="registry-server" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.630851 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="644fb5cf-0fad-4825-9975-46e8c5f3e1ec" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.631558 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.636250 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm"] Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.688944 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.689075 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.689358 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.689590 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gtc4b" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.707666 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8555d929-3dc5-4d7c-9635-fcc096789e43-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wwvsm\" (UID: \"8555d929-3dc5-4d7c-9635-fcc096789e43\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.707743 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zckv\" (UniqueName: \"kubernetes.io/projected/8555d929-3dc5-4d7c-9635-fcc096789e43-kube-api-access-6zckv\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wwvsm\" (UID: \"8555d929-3dc5-4d7c-9635-fcc096789e43\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.712031 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8555d929-3dc5-4d7c-9635-fcc096789e43-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wwvsm\" (UID: \"8555d929-3dc5-4d7c-9635-fcc096789e43\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.815003 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8555d929-3dc5-4d7c-9635-fcc096789e43-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wwvsm\" (UID: \"8555d929-3dc5-4d7c-9635-fcc096789e43\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.815332 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zckv\" (UniqueName: \"kubernetes.io/projected/8555d929-3dc5-4d7c-9635-fcc096789e43-kube-api-access-6zckv\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wwvsm\" (UID: \"8555d929-3dc5-4d7c-9635-fcc096789e43\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.815580 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8555d929-3dc5-4d7c-9635-fcc096789e43-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wwvsm\" (UID: \"8555d929-3dc5-4d7c-9635-fcc096789e43\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.821319 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8555d929-3dc5-4d7c-9635-fcc096789e43-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wwvsm\" (UID: \"8555d929-3dc5-4d7c-9635-fcc096789e43\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.827539 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8555d929-3dc5-4d7c-9635-fcc096789e43-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wwvsm\" (UID: \"8555d929-3dc5-4d7c-9635-fcc096789e43\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm" Dec 05 06:17:56 crc kubenswrapper[4865]: I1205 06:17:56.832197 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zckv\" (UniqueName: \"kubernetes.io/projected/8555d929-3dc5-4d7c-9635-fcc096789e43-kube-api-access-6zckv\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wwvsm\" (UID: \"8555d929-3dc5-4d7c-9635-fcc096789e43\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm" Dec 05 06:17:57 crc kubenswrapper[4865]: I1205 06:17:57.021759 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm" Dec 05 06:17:57 crc kubenswrapper[4865]: I1205 06:17:57.638756 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm"] Dec 05 06:17:58 crc kubenswrapper[4865]: I1205 06:17:58.512534 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm" event={"ID":"8555d929-3dc5-4d7c-9635-fcc096789e43","Type":"ContainerStarted","Data":"823bfe7972ab215f23050107c6351570c1e728da075b5eea919aecc1d2a92f10"} Dec 05 06:17:59 crc kubenswrapper[4865]: I1205 06:17:59.524922 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm" event={"ID":"8555d929-3dc5-4d7c-9635-fcc096789e43","Type":"ContainerStarted","Data":"2a3ed2f441aaf9f775364f484be74b35961aac537edb7b7b8dd21e23cb6074ef"} Dec 05 06:17:59 crc kubenswrapper[4865]: I1205 06:17:59.554681 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm" podStartSLOduration=2.700564742 podStartE2EDuration="3.554654722s" podCreationTimestamp="2025-12-05 06:17:56 +0000 UTC" firstStartedPulling="2025-12-05 06:17:57.641355979 +0000 UTC m=+1496.921367201" lastFinishedPulling="2025-12-05 06:17:58.495445959 +0000 UTC m=+1497.775457181" observedRunningTime="2025-12-05 06:17:59.545466854 +0000 UTC m=+1498.825478106" watchObservedRunningTime="2025-12-05 06:17:59.554654722 +0000 UTC m=+1498.834665974" Dec 05 06:18:01 crc kubenswrapper[4865]: I1205 06:18:01.544018 4865 generic.go:334] "Generic (PLEG): container finished" podID="8555d929-3dc5-4d7c-9635-fcc096789e43" containerID="2a3ed2f441aaf9f775364f484be74b35961aac537edb7b7b8dd21e23cb6074ef" exitCode=0 Dec 05 06:18:01 crc kubenswrapper[4865]: I1205 06:18:01.544509 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm" 
event={"ID":"8555d929-3dc5-4d7c-9635-fcc096789e43","Type":"ContainerDied","Data":"2a3ed2f441aaf9f775364f484be74b35961aac537edb7b7b8dd21e23cb6074ef"} Dec 05 06:18:02 crc kubenswrapper[4865]: I1205 06:18:02.963091 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.069683 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8555d929-3dc5-4d7c-9635-fcc096789e43-inventory\") pod \"8555d929-3dc5-4d7c-9635-fcc096789e43\" (UID: \"8555d929-3dc5-4d7c-9635-fcc096789e43\") " Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.069874 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8555d929-3dc5-4d7c-9635-fcc096789e43-ssh-key\") pod \"8555d929-3dc5-4d7c-9635-fcc096789e43\" (UID: \"8555d929-3dc5-4d7c-9635-fcc096789e43\") " Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.070165 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zckv\" (UniqueName: \"kubernetes.io/projected/8555d929-3dc5-4d7c-9635-fcc096789e43-kube-api-access-6zckv\") pod \"8555d929-3dc5-4d7c-9635-fcc096789e43\" (UID: \"8555d929-3dc5-4d7c-9635-fcc096789e43\") " Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.076811 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8555d929-3dc5-4d7c-9635-fcc096789e43-kube-api-access-6zckv" (OuterVolumeSpecName: "kube-api-access-6zckv") pod "8555d929-3dc5-4d7c-9635-fcc096789e43" (UID: "8555d929-3dc5-4d7c-9635-fcc096789e43"). InnerVolumeSpecName "kube-api-access-6zckv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.109008 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8555d929-3dc5-4d7c-9635-fcc096789e43-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8555d929-3dc5-4d7c-9635-fcc096789e43" (UID: "8555d929-3dc5-4d7c-9635-fcc096789e43"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.116472 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8555d929-3dc5-4d7c-9635-fcc096789e43-inventory" (OuterVolumeSpecName: "inventory") pod "8555d929-3dc5-4d7c-9635-fcc096789e43" (UID: "8555d929-3dc5-4d7c-9635-fcc096789e43"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.172553 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zckv\" (UniqueName: \"kubernetes.io/projected/8555d929-3dc5-4d7c-9635-fcc096789e43-kube-api-access-6zckv\") on node \"crc\" DevicePath \"\"" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.172592 4865 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8555d929-3dc5-4d7c-9635-fcc096789e43-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.172605 4865 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8555d929-3dc5-4d7c-9635-fcc096789e43-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.566413 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm" event={"ID":"8555d929-3dc5-4d7c-9635-fcc096789e43","Type":"ContainerDied","Data":"823bfe7972ab215f23050107c6351570c1e728da075b5eea919aecc1d2a92f10"} Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.566801 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="823bfe7972ab215f23050107c6351570c1e728da075b5eea919aecc1d2a92f10" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.566536 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wwvsm" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.674176 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j"] Dec 05 06:18:03 crc kubenswrapper[4865]: E1205 06:18:03.674779 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8555d929-3dc5-4d7c-9635-fcc096789e43" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.674803 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="8555d929-3dc5-4d7c-9635-fcc096789e43" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.675111 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="8555d929-3dc5-4d7c-9635-fcc096789e43" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.676107 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.684435 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gtc4b" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.684744 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.685036 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.685246 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.690180 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j"] Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.785887 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea0e7080-5e20-4b45-9896-2cda6b9e332f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j\" (UID: \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.786578 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f7bx\" (UniqueName: \"kubernetes.io/projected/ea0e7080-5e20-4b45-9896-2cda6b9e332f-kube-api-access-5f7bx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j\" (UID: \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.786778 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea0e7080-5e20-4b45-9896-2cda6b9e332f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j\" (UID: \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.786907 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea0e7080-5e20-4b45-9896-2cda6b9e332f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j\" (UID: \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.888555 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea0e7080-5e20-4b45-9896-2cda6b9e332f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j\" (UID: \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.888612 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea0e7080-5e20-4b45-9896-2cda6b9e332f-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j\" (UID: \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.888721 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea0e7080-5e20-4b45-9896-2cda6b9e332f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j\" (UID: \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.888765 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f7bx\" (UniqueName: \"kubernetes.io/projected/ea0e7080-5e20-4b45-9896-2cda6b9e332f-kube-api-access-5f7bx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j\" (UID: \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.892190 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea0e7080-5e20-4b45-9896-2cda6b9e332f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j\" (UID: \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.892190 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea0e7080-5e20-4b45-9896-2cda6b9e332f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j\" (UID: \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.892698 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea0e7080-5e20-4b45-9896-2cda6b9e332f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j\" (UID: \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.909426 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f7bx\" (UniqueName: \"kubernetes.io/projected/ea0e7080-5e20-4b45-9896-2cda6b9e332f-kube-api-access-5f7bx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j\" (UID: \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" Dec 05 06:18:03 crc kubenswrapper[4865]: I1205 06:18:03.994253 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" Dec 05 06:18:04 crc kubenswrapper[4865]: I1205 06:18:04.571923 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j"] Dec 05 06:18:05 crc kubenswrapper[4865]: I1205 06:18:05.589853 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" event={"ID":"ea0e7080-5e20-4b45-9896-2cda6b9e332f","Type":"ContainerStarted","Data":"5c835f421fd7c5f54dc953730de284f20c334dedd0961332bdc10d9281888f0c"} Dec 05 06:18:05 crc kubenswrapper[4865]: I1205 06:18:05.590175 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" event={"ID":"ea0e7080-5e20-4b45-9896-2cda6b9e332f","Type":"ContainerStarted","Data":"827e75372c84b26fe830a5edac483a0ac126f9b4f60872af156bbc94ecc9c6fa"} Dec 05 06:18:05 crc kubenswrapper[4865]: I1205 06:18:05.614571 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" podStartSLOduration=2.139316056 podStartE2EDuration="2.614546134s" podCreationTimestamp="2025-12-05 06:18:03 +0000 UTC" firstStartedPulling="2025-12-05 06:18:04.583949771 +0000 UTC m=+1503.863960993" lastFinishedPulling="2025-12-05 06:18:05.059179849 +0000 UTC m=+1504.339191071" observedRunningTime="2025-12-05 06:18:05.605727457 +0000 UTC m=+1504.885738699" watchObservedRunningTime="2025-12-05 06:18:05.614546134 +0000 UTC m=+1504.894557356" Dec 05 06:18:11 crc kubenswrapper[4865]: I1205 06:18:11.048513 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:18:11 crc kubenswrapper[4865]: I1205 06:18:11.049144 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:18:16 crc kubenswrapper[4865]: I1205 06:18:16.758449 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tfcn2"] Dec 05 06:18:16 crc kubenswrapper[4865]: I1205 06:18:16.766084 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tfcn2" Dec 05 06:18:16 crc kubenswrapper[4865]: I1205 06:18:16.801028 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfcn2"] Dec 05 06:18:16 crc kubenswrapper[4865]: I1205 06:18:16.873916 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhl95\" (UniqueName: \"kubernetes.io/projected/a8bf48d9-c317-4766-8fec-4b12f052cc0e-kube-api-access-nhl95\") pod \"redhat-marketplace-tfcn2\" (UID: \"a8bf48d9-c317-4766-8fec-4b12f052cc0e\") " pod="openshift-marketplace/redhat-marketplace-tfcn2" Dec 05 06:18:16 crc kubenswrapper[4865]: I1205 06:18:16.874105 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8bf48d9-c317-4766-8fec-4b12f052cc0e-utilities\") pod \"redhat-marketplace-tfcn2\" (UID: \"a8bf48d9-c317-4766-8fec-4b12f052cc0e\") " pod="openshift-marketplace/redhat-marketplace-tfcn2" Dec 05 06:18:16 crc kubenswrapper[4865]: I1205 06:18:16.874489 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8bf48d9-c317-4766-8fec-4b12f052cc0e-catalog-content\") pod \"redhat-marketplace-tfcn2\" (UID: \"a8bf48d9-c317-4766-8fec-4b12f052cc0e\") " pod="openshift-marketplace/redhat-marketplace-tfcn2" Dec 05 06:18:16 crc kubenswrapper[4865]: I1205 06:18:16.976872 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhl95\" (UniqueName: \"kubernetes.io/projected/a8bf48d9-c317-4766-8fec-4b12f052cc0e-kube-api-access-nhl95\") pod \"redhat-marketplace-tfcn2\" (UID: \"a8bf48d9-c317-4766-8fec-4b12f052cc0e\") " pod="openshift-marketplace/redhat-marketplace-tfcn2" Dec 05 06:18:16 crc kubenswrapper[4865]: I1205 06:18:16.976974 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8bf48d9-c317-4766-8fec-4b12f052cc0e-utilities\") pod \"redhat-marketplace-tfcn2\" (UID: \"a8bf48d9-c317-4766-8fec-4b12f052cc0e\") " pod="openshift-marketplace/redhat-marketplace-tfcn2" Dec 05 06:18:16 crc kubenswrapper[4865]: I1205 06:18:16.977066 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8bf48d9-c317-4766-8fec-4b12f052cc0e-catalog-content\") pod \"redhat-marketplace-tfcn2\" (UID: \"a8bf48d9-c317-4766-8fec-4b12f052cc0e\") " pod="openshift-marketplace/redhat-marketplace-tfcn2" Dec 05 06:18:16 crc kubenswrapper[4865]: I1205 06:18:16.977609 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8bf48d9-c317-4766-8fec-4b12f052cc0e-catalog-content\") pod \"redhat-marketplace-tfcn2\" (UID: \"a8bf48d9-c317-4766-8fec-4b12f052cc0e\") " pod="openshift-marketplace/redhat-marketplace-tfcn2" Dec 05 06:18:16 crc kubenswrapper[4865]: I1205 06:18:16.977904 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8bf48d9-c317-4766-8fec-4b12f052cc0e-utilities\") pod \"redhat-marketplace-tfcn2\" (UID: \"a8bf48d9-c317-4766-8fec-4b12f052cc0e\") " pod="openshift-marketplace/redhat-marketplace-tfcn2" Dec 05 06:18:17 crc kubenswrapper[4865]: I1205 06:18:17.008293 4865 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-nhl95\" (UniqueName: \"kubernetes.io/projected/a8bf48d9-c317-4766-8fec-4b12f052cc0e-kube-api-access-nhl95\") pod \"redhat-marketplace-tfcn2\" (UID: \"a8bf48d9-c317-4766-8fec-4b12f052cc0e\") " pod="openshift-marketplace/redhat-marketplace-tfcn2" Dec 05 06:18:17 crc kubenswrapper[4865]: I1205 06:18:17.109405 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tfcn2" Dec 05 06:18:17 crc kubenswrapper[4865]: I1205 06:18:17.589563 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfcn2"] Dec 05 06:18:17 crc kubenswrapper[4865]: I1205 06:18:17.758082 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfcn2" event={"ID":"a8bf48d9-c317-4766-8fec-4b12f052cc0e","Type":"ContainerStarted","Data":"eced6c42faadaf0308091766c2d32503ce4fd88c395f293ad49b3f84bb4a54dc"} Dec 05 06:18:18 crc kubenswrapper[4865]: I1205 06:18:18.771062 4865 generic.go:334] "Generic (PLEG): container finished" podID="a8bf48d9-c317-4766-8fec-4b12f052cc0e" containerID="4c4f3dc8553e783e526d350fa8b32b1a0cf319b47b69596f1864d965165ea1ca" exitCode=0 Dec 05 06:18:18 crc kubenswrapper[4865]: I1205 06:18:18.771257 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfcn2" event={"ID":"a8bf48d9-c317-4766-8fec-4b12f052cc0e","Type":"ContainerDied","Data":"4c4f3dc8553e783e526d350fa8b32b1a0cf319b47b69596f1864d965165ea1ca"} Dec 05 06:18:20 crc kubenswrapper[4865]: I1205 06:18:20.903149 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-webhook-5655c58dd6-nkfts" podUID="5d3a98df-9953-49ab-a722-f37837073178" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.70:6080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 06:18:21 crc kubenswrapper[4865]: I1205 06:18:21.101523 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-5655c58dd6-nkfts" podUID="5d3a98df-9953-49ab-a722-f37837073178" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.70:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 06:18:21 crc kubenswrapper[4865]: I1205 06:18:21.106498 4865 trace.go:236] Trace[1502690237]: "Calculate volume metrics of catalog-content for pod openshift-marketplace/certified-operators-vvxh5" (05-Dec-2025 06:18:19.764) (total time: 1341ms): Dec 05 06:18:21 crc kubenswrapper[4865]: Trace[1502690237]: [1.341966137s] [1.341966137s] END Dec 05 06:18:22 crc kubenswrapper[4865]: I1205 06:18:22.236808 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfcn2" event={"ID":"a8bf48d9-c317-4766-8fec-4b12f052cc0e","Type":"ContainerStarted","Data":"70e374c3492a8594f651d985601bee20cdda46db5e21861382952a26b29d9ac9"} Dec 05 06:18:23 crc kubenswrapper[4865]: I1205 06:18:23.246553 4865 generic.go:334] "Generic (PLEG): container finished" podID="a8bf48d9-c317-4766-8fec-4b12f052cc0e" containerID="70e374c3492a8594f651d985601bee20cdda46db5e21861382952a26b29d9ac9" exitCode=0 Dec 05 06:18:23 crc kubenswrapper[4865]: I1205 06:18:23.246657 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfcn2" 
event={"ID":"a8bf48d9-c317-4766-8fec-4b12f052cc0e","Type":"ContainerDied","Data":"70e374c3492a8594f651d985601bee20cdda46db5e21861382952a26b29d9ac9"} Dec 05 06:18:24 crc kubenswrapper[4865]: I1205 06:18:24.259695 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfcn2" event={"ID":"a8bf48d9-c317-4766-8fec-4b12f052cc0e","Type":"ContainerStarted","Data":"b0d674a664f7eac169c629f7d878f404cf704512349e162c2b04777dd1e5ea73"} Dec 05 06:18:24 crc kubenswrapper[4865]: I1205 06:18:24.292479 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tfcn2" podStartSLOduration=3.4325983620000002 podStartE2EDuration="8.292448869s" podCreationTimestamp="2025-12-05 06:18:16 +0000 UTC" firstStartedPulling="2025-12-05 06:18:18.773789581 +0000 UTC m=+1518.053800803" lastFinishedPulling="2025-12-05 06:18:23.633640088 +0000 UTC m=+1522.913651310" observedRunningTime="2025-12-05 06:18:24.285212474 +0000 UTC m=+1523.565223716" watchObservedRunningTime="2025-12-05 06:18:24.292448869 +0000 UTC m=+1523.572460091" Dec 05 06:18:27 crc kubenswrapper[4865]: I1205 06:18:27.109843 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tfcn2" Dec 05 06:18:27 crc kubenswrapper[4865]: I1205 06:18:27.111498 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tfcn2" Dec 05 06:18:27 crc kubenswrapper[4865]: I1205 06:18:27.162721 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tfcn2" Dec 05 06:18:28 crc kubenswrapper[4865]: I1205 06:18:28.366919 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tfcn2" Dec 05 06:18:28 crc kubenswrapper[4865]: I1205 06:18:28.420258 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfcn2"] Dec 05 06:18:30 crc kubenswrapper[4865]: I1205 06:18:30.322788 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tfcn2" podUID="a8bf48d9-c317-4766-8fec-4b12f052cc0e" containerName="registry-server" containerID="cri-o://b0d674a664f7eac169c629f7d878f404cf704512349e162c2b04777dd1e5ea73" gracePeriod=2 Dec 05 06:18:31 crc kubenswrapper[4865]: I1205 06:18:31.343246 4865 generic.go:334] "Generic (PLEG): container finished" podID="a8bf48d9-c317-4766-8fec-4b12f052cc0e" containerID="b0d674a664f7eac169c629f7d878f404cf704512349e162c2b04777dd1e5ea73" exitCode=0 Dec 05 06:18:31 crc kubenswrapper[4865]: I1205 06:18:31.343516 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfcn2" event={"ID":"a8bf48d9-c317-4766-8fec-4b12f052cc0e","Type":"ContainerDied","Data":"b0d674a664f7eac169c629f7d878f404cf704512349e162c2b04777dd1e5ea73"} Dec 05 06:18:31 crc kubenswrapper[4865]: I1205 06:18:31.415190 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tfcn2" Dec 05 06:18:31 crc kubenswrapper[4865]: I1205 06:18:31.592248 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhl95\" (UniqueName: \"kubernetes.io/projected/a8bf48d9-c317-4766-8fec-4b12f052cc0e-kube-api-access-nhl95\") pod \"a8bf48d9-c317-4766-8fec-4b12f052cc0e\" (UID: \"a8bf48d9-c317-4766-8fec-4b12f052cc0e\") " Dec 05 06:18:31 crc kubenswrapper[4865]: I1205 06:18:31.592441 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8bf48d9-c317-4766-8fec-4b12f052cc0e-catalog-content\") pod \"a8bf48d9-c317-4766-8fec-4b12f052cc0e\" (UID: \"a8bf48d9-c317-4766-8fec-4b12f052cc0e\") " Dec 05 06:18:31 crc kubenswrapper[4865]: I1205 06:18:31.592513 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8bf48d9-c317-4766-8fec-4b12f052cc0e-utilities\") pod \"a8bf48d9-c317-4766-8fec-4b12f052cc0e\" (UID: \"a8bf48d9-c317-4766-8fec-4b12f052cc0e\") " Dec 05 06:18:31 crc kubenswrapper[4865]: I1205 06:18:31.593772 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8bf48d9-c317-4766-8fec-4b12f052cc0e-utilities" (OuterVolumeSpecName: "utilities") pod "a8bf48d9-c317-4766-8fec-4b12f052cc0e" (UID: "a8bf48d9-c317-4766-8fec-4b12f052cc0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:18:31 crc kubenswrapper[4865]: I1205 06:18:31.601076 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8bf48d9-c317-4766-8fec-4b12f052cc0e-kube-api-access-nhl95" (OuterVolumeSpecName: "kube-api-access-nhl95") pod "a8bf48d9-c317-4766-8fec-4b12f052cc0e" (UID: "a8bf48d9-c317-4766-8fec-4b12f052cc0e"). InnerVolumeSpecName "kube-api-access-nhl95". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:18:31 crc kubenswrapper[4865]: I1205 06:18:31.619546 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8bf48d9-c317-4766-8fec-4b12f052cc0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8bf48d9-c317-4766-8fec-4b12f052cc0e" (UID: "a8bf48d9-c317-4766-8fec-4b12f052cc0e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:18:31 crc kubenswrapper[4865]: I1205 06:18:31.694929 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhl95\" (UniqueName: \"kubernetes.io/projected/a8bf48d9-c317-4766-8fec-4b12f052cc0e-kube-api-access-nhl95\") on node \"crc\" DevicePath \"\"" Dec 05 06:18:31 crc kubenswrapper[4865]: I1205 06:18:31.695189 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8bf48d9-c317-4766-8fec-4b12f052cc0e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:18:31 crc kubenswrapper[4865]: I1205 06:18:31.695254 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8bf48d9-c317-4766-8fec-4b12f052cc0e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:18:32 crc kubenswrapper[4865]: I1205 06:18:32.356400 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfcn2" event={"ID":"a8bf48d9-c317-4766-8fec-4b12f052cc0e","Type":"ContainerDied","Data":"eced6c42faadaf0308091766c2d32503ce4fd88c395f293ad49b3f84bb4a54dc"} Dec 05 06:18:32 crc kubenswrapper[4865]: I1205 06:18:32.356457 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tfcn2" Dec 05 06:18:32 crc kubenswrapper[4865]: I1205 06:18:32.356481 4865 scope.go:117] "RemoveContainer" containerID="b0d674a664f7eac169c629f7d878f404cf704512349e162c2b04777dd1e5ea73" Dec 05 06:18:32 crc kubenswrapper[4865]: I1205 06:18:32.388315 4865 scope.go:117] "RemoveContainer" containerID="70e374c3492a8594f651d985601bee20cdda46db5e21861382952a26b29d9ac9" Dec 05 06:18:32 crc kubenswrapper[4865]: I1205 06:18:32.395799 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfcn2"] Dec 05 06:18:32 crc kubenswrapper[4865]: I1205 06:18:32.413354 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfcn2"] Dec 05 06:18:32 crc kubenswrapper[4865]: I1205 06:18:32.414808 4865 scope.go:117] "RemoveContainer" containerID="4c4f3dc8553e783e526d350fa8b32b1a0cf319b47b69596f1864d965165ea1ca" Dec 05 06:18:33 crc kubenswrapper[4865]: I1205 06:18:33.020362 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8bf48d9-c317-4766-8fec-4b12f052cc0e" path="/var/lib/kubelet/pods/a8bf48d9-c317-4766-8fec-4b12f052cc0e/volumes" Dec 05 06:18:36 crc kubenswrapper[4865]: I1205 06:18:36.791011 4865 scope.go:117] "RemoveContainer" containerID="7815231a3951ecc27a948b95402d899cce2a6aa10c5addd5981f6d55a4c1efe9" Dec 05 06:18:39 crc kubenswrapper[4865]: I1205 06:18:39.711871 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vjx99"] Dec 05 06:18:39 crc kubenswrapper[4865]: E1205 06:18:39.713714 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8bf48d9-c317-4766-8fec-4b12f052cc0e" containerName="registry-server" Dec 05 06:18:39 crc kubenswrapper[4865]: I1205 06:18:39.713794 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8bf48d9-c317-4766-8fec-4b12f052cc0e" containerName="registry-server" Dec 05 06:18:39 crc kubenswrapper[4865]: E1205 06:18:39.713876 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8bf48d9-c317-4766-8fec-4b12f052cc0e" containerName="extract-utilities" Dec 05 06:18:39 crc kubenswrapper[4865]: I1205 06:18:39.713929 4865 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="a8bf48d9-c317-4766-8fec-4b12f052cc0e" containerName="extract-utilities" Dec 05 06:18:39 crc kubenswrapper[4865]: E1205 06:18:39.714005 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8bf48d9-c317-4766-8fec-4b12f052cc0e" containerName="extract-content" Dec 05 06:18:39 crc kubenswrapper[4865]: I1205 06:18:39.714060 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8bf48d9-c317-4766-8fec-4b12f052cc0e" containerName="extract-content" Dec 05 06:18:39 crc kubenswrapper[4865]: I1205 06:18:39.714330 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8bf48d9-c317-4766-8fec-4b12f052cc0e" containerName="registry-server" Dec 05 06:18:39 crc kubenswrapper[4865]: I1205 06:18:39.715775 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vjx99" Dec 05 06:18:39 crc kubenswrapper[4865]: I1205 06:18:39.757433 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vjx99"] Dec 05 06:18:39 crc kubenswrapper[4865]: I1205 06:18:39.869064 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrb2k\" (UniqueName: \"kubernetes.io/projected/133cb552-10d7-4050-80ea-db2f1570cb32-kube-api-access-zrb2k\") pod \"community-operators-vjx99\" (UID: \"133cb552-10d7-4050-80ea-db2f1570cb32\") " pod="openshift-marketplace/community-operators-vjx99" Dec 05 06:18:39 crc kubenswrapper[4865]: I1205 06:18:39.869150 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/133cb552-10d7-4050-80ea-db2f1570cb32-utilities\") pod \"community-operators-vjx99\" (UID: \"133cb552-10d7-4050-80ea-db2f1570cb32\") " pod="openshift-marketplace/community-operators-vjx99" Dec 05 06:18:39 crc kubenswrapper[4865]: I1205 06:18:39.869244 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/133cb552-10d7-4050-80ea-db2f1570cb32-catalog-content\") pod \"community-operators-vjx99\" (UID: \"133cb552-10d7-4050-80ea-db2f1570cb32\") " pod="openshift-marketplace/community-operators-vjx99" Dec 05 06:18:39 crc kubenswrapper[4865]: I1205 06:18:39.970997 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/133cb552-10d7-4050-80ea-db2f1570cb32-catalog-content\") pod \"community-operators-vjx99\" (UID: \"133cb552-10d7-4050-80ea-db2f1570cb32\") " pod="openshift-marketplace/community-operators-vjx99" Dec 05 06:18:39 crc kubenswrapper[4865]: I1205 06:18:39.971332 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrb2k\" (UniqueName: \"kubernetes.io/projected/133cb552-10d7-4050-80ea-db2f1570cb32-kube-api-access-zrb2k\") pod \"community-operators-vjx99\" (UID: \"133cb552-10d7-4050-80ea-db2f1570cb32\") " pod="openshift-marketplace/community-operators-vjx99" Dec 05 06:18:39 crc kubenswrapper[4865]: I1205 06:18:39.971415 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/133cb552-10d7-4050-80ea-db2f1570cb32-utilities\") pod \"community-operators-vjx99\" (UID: \"133cb552-10d7-4050-80ea-db2f1570cb32\") " pod="openshift-marketplace/community-operators-vjx99" Dec 05 06:18:39 crc kubenswrapper[4865]: 
I1205 06:18:39.972089 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/133cb552-10d7-4050-80ea-db2f1570cb32-catalog-content\") pod \"community-operators-vjx99\" (UID: \"133cb552-10d7-4050-80ea-db2f1570cb32\") " pod="openshift-marketplace/community-operators-vjx99" Dec 05 06:18:39 crc kubenswrapper[4865]: I1205 06:18:39.972258 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/133cb552-10d7-4050-80ea-db2f1570cb32-utilities\") pod \"community-operators-vjx99\" (UID: \"133cb552-10d7-4050-80ea-db2f1570cb32\") " pod="openshift-marketplace/community-operators-vjx99" Dec 05 06:18:40 crc kubenswrapper[4865]: I1205 06:18:40.001288 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrb2k\" (UniqueName: \"kubernetes.io/projected/133cb552-10d7-4050-80ea-db2f1570cb32-kube-api-access-zrb2k\") pod \"community-operators-vjx99\" (UID: \"133cb552-10d7-4050-80ea-db2f1570cb32\") " pod="openshift-marketplace/community-operators-vjx99" Dec 05 06:18:40 crc kubenswrapper[4865]: I1205 06:18:40.034564 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vjx99" Dec 05 06:18:40 crc kubenswrapper[4865]: I1205 06:18:40.552416 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vjx99"] Dec 05 06:18:41 crc kubenswrapper[4865]: I1205 06:18:41.048799 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:18:41 crc kubenswrapper[4865]: I1205 06:18:41.049336 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:18:41 crc kubenswrapper[4865]: I1205 06:18:41.049396 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 06:18:41 crc kubenswrapper[4865]: I1205 06:18:41.050420 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e"} pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 06:18:41 crc kubenswrapper[4865]: I1205 06:18:41.050495 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" containerID="cri-o://92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" gracePeriod=600 Dec 05 06:18:41 crc kubenswrapper[4865]: E1205 06:18:41.214465 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:18:41 crc kubenswrapper[4865]: I1205 06:18:41.452059 4865 generic.go:334] "Generic (PLEG): container finished" podID="133cb552-10d7-4050-80ea-db2f1570cb32" containerID="51be9a7b9dea9c434826a1d24e0ca0e503d8ec3ed914b84da72232baedae6b6a" exitCode=0 Dec 05 06:18:41 crc kubenswrapper[4865]: I1205 06:18:41.452141 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vjx99" event={"ID":"133cb552-10d7-4050-80ea-db2f1570cb32","Type":"ContainerDied","Data":"51be9a7b9dea9c434826a1d24e0ca0e503d8ec3ed914b84da72232baedae6b6a"} Dec 05 06:18:41 crc kubenswrapper[4865]: I1205 06:18:41.452170 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vjx99" event={"ID":"133cb552-10d7-4050-80ea-db2f1570cb32","Type":"ContainerStarted","Data":"cb18203eafe5a8f8bbc0cf2731389f4d3584c52a42da42db58b04b4e5a33094b"} Dec 05 06:18:41 crc kubenswrapper[4865]: I1205 06:18:41.457157 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" exitCode=0 Dec 05 06:18:41 crc kubenswrapper[4865]: I1205 06:18:41.457199 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e"} Dec 05 06:18:41 crc kubenswrapper[4865]: I1205 06:18:41.457234 4865 scope.go:117] "RemoveContainer" containerID="39937e10b37729de9655b631fb05427006e716f9ab3edcd0d9c7edbbc9b5832a" Dec 05 06:18:41 crc kubenswrapper[4865]: I1205 06:18:41.457873 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:18:41 crc kubenswrapper[4865]: E1205 06:18:41.458124 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:18:42 crc kubenswrapper[4865]: I1205 06:18:42.470370 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vjx99" event={"ID":"133cb552-10d7-4050-80ea-db2f1570cb32","Type":"ContainerStarted","Data":"c956312fc3f72f6237550b02995a337f662c523a50185bc8098c0a7ff35b42b3"} Dec 05 06:18:43 crc kubenswrapper[4865]: I1205 06:18:43.482232 4865 generic.go:334] "Generic (PLEG): container finished" podID="133cb552-10d7-4050-80ea-db2f1570cb32" containerID="c956312fc3f72f6237550b02995a337f662c523a50185bc8098c0a7ff35b42b3" exitCode=0 Dec 05 06:18:43 crc kubenswrapper[4865]: I1205 06:18:43.482336 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vjx99" event={"ID":"133cb552-10d7-4050-80ea-db2f1570cb32","Type":"ContainerDied","Data":"c956312fc3f72f6237550b02995a337f662c523a50185bc8098c0a7ff35b42b3"} Dec 05 06:18:44 crc kubenswrapper[4865]: I1205 06:18:44.492964 4865 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vjx99" event={"ID":"133cb552-10d7-4050-80ea-db2f1570cb32","Type":"ContainerStarted","Data":"b5bb3b14d226a05c741967698de7d070329ea6516b6f49136353ba8b8f901f25"} Dec 05 06:18:44 crc kubenswrapper[4865]: I1205 06:18:44.529614 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vjx99" podStartSLOduration=2.8691210099999997 podStartE2EDuration="5.529594922s" podCreationTimestamp="2025-12-05 06:18:39 +0000 UTC" firstStartedPulling="2025-12-05 06:18:41.454799853 +0000 UTC m=+1540.734811075" lastFinishedPulling="2025-12-05 06:18:44.115273775 +0000 UTC m=+1543.395284987" observedRunningTime="2025-12-05 06:18:44.524564706 +0000 UTC m=+1543.804575928" watchObservedRunningTime="2025-12-05 06:18:44.529594922 +0000 UTC m=+1543.809606144" Dec 05 06:18:50 crc kubenswrapper[4865]: I1205 06:18:50.035957 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vjx99" Dec 05 06:18:50 crc kubenswrapper[4865]: I1205 06:18:50.036582 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vjx99" Dec 05 06:18:50 crc kubenswrapper[4865]: I1205 06:18:50.096566 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vjx99" Dec 05 06:18:50 crc kubenswrapper[4865]: I1205 06:18:50.594262 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vjx99" Dec 05 06:18:50 crc kubenswrapper[4865]: I1205 06:18:50.651267 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vjx99"] Dec 05 06:18:52 crc kubenswrapper[4865]: I1205 06:18:52.568644 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vjx99" podUID="133cb552-10d7-4050-80ea-db2f1570cb32" containerName="registry-server" containerID="cri-o://b5bb3b14d226a05c741967698de7d070329ea6516b6f49136353ba8b8f901f25" gracePeriod=2 Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.006169 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:18:53 crc kubenswrapper[4865]: E1205 06:18:53.006510 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.375436 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vjx99" Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.504726 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/133cb552-10d7-4050-80ea-db2f1570cb32-utilities\") pod \"133cb552-10d7-4050-80ea-db2f1570cb32\" (UID: \"133cb552-10d7-4050-80ea-db2f1570cb32\") " Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.504785 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/133cb552-10d7-4050-80ea-db2f1570cb32-catalog-content\") pod \"133cb552-10d7-4050-80ea-db2f1570cb32\" (UID: \"133cb552-10d7-4050-80ea-db2f1570cb32\") " Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.504866 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrb2k\" (UniqueName: \"kubernetes.io/projected/133cb552-10d7-4050-80ea-db2f1570cb32-kube-api-access-zrb2k\") pod \"133cb552-10d7-4050-80ea-db2f1570cb32\" (UID: \"133cb552-10d7-4050-80ea-db2f1570cb32\") " Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.505763 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/133cb552-10d7-4050-80ea-db2f1570cb32-utilities" (OuterVolumeSpecName: "utilities") pod "133cb552-10d7-4050-80ea-db2f1570cb32" (UID: "133cb552-10d7-4050-80ea-db2f1570cb32"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.515906 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/133cb552-10d7-4050-80ea-db2f1570cb32-kube-api-access-zrb2k" (OuterVolumeSpecName: "kube-api-access-zrb2k") pod "133cb552-10d7-4050-80ea-db2f1570cb32" (UID: "133cb552-10d7-4050-80ea-db2f1570cb32"). InnerVolumeSpecName "kube-api-access-zrb2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.558522 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/133cb552-10d7-4050-80ea-db2f1570cb32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "133cb552-10d7-4050-80ea-db2f1570cb32" (UID: "133cb552-10d7-4050-80ea-db2f1570cb32"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.580734 4865 generic.go:334] "Generic (PLEG): container finished" podID="133cb552-10d7-4050-80ea-db2f1570cb32" containerID="b5bb3b14d226a05c741967698de7d070329ea6516b6f49136353ba8b8f901f25" exitCode=0 Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.580815 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vjx99" Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.581876 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vjx99" event={"ID":"133cb552-10d7-4050-80ea-db2f1570cb32","Type":"ContainerDied","Data":"b5bb3b14d226a05c741967698de7d070329ea6516b6f49136353ba8b8f901f25"} Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.582004 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vjx99" event={"ID":"133cb552-10d7-4050-80ea-db2f1570cb32","Type":"ContainerDied","Data":"cb18203eafe5a8f8bbc0cf2731389f4d3584c52a42da42db58b04b4e5a33094b"} Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.582082 4865 scope.go:117] "RemoveContainer" containerID="b5bb3b14d226a05c741967698de7d070329ea6516b6f49136353ba8b8f901f25" Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.613290 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/133cb552-10d7-4050-80ea-db2f1570cb32-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.613334 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/133cb552-10d7-4050-80ea-db2f1570cb32-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.613348 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrb2k\" (UniqueName: \"kubernetes.io/projected/133cb552-10d7-4050-80ea-db2f1570cb32-kube-api-access-zrb2k\") on node \"crc\" DevicePath \"\"" Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.615801 4865 scope.go:117] "RemoveContainer" containerID="c956312fc3f72f6237550b02995a337f662c523a50185bc8098c0a7ff35b42b3" Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.620801 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vjx99"] Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.641904 4865 scope.go:117] "RemoveContainer" containerID="51be9a7b9dea9c434826a1d24e0ca0e503d8ec3ed914b84da72232baedae6b6a" Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.647673 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vjx99"] Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.692778 4865 scope.go:117] "RemoveContainer" containerID="b5bb3b14d226a05c741967698de7d070329ea6516b6f49136353ba8b8f901f25" Dec 05 06:18:53 crc kubenswrapper[4865]: E1205 06:18:53.693393 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5bb3b14d226a05c741967698de7d070329ea6516b6f49136353ba8b8f901f25\": container with ID starting with b5bb3b14d226a05c741967698de7d070329ea6516b6f49136353ba8b8f901f25 not found: ID does not exist" containerID="b5bb3b14d226a05c741967698de7d070329ea6516b6f49136353ba8b8f901f25" Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.693440 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5bb3b14d226a05c741967698de7d070329ea6516b6f49136353ba8b8f901f25"} err="failed to get container status \"b5bb3b14d226a05c741967698de7d070329ea6516b6f49136353ba8b8f901f25\": rpc error: code = NotFound desc = could not find container \"b5bb3b14d226a05c741967698de7d070329ea6516b6f49136353ba8b8f901f25\": container with ID starting 
with b5bb3b14d226a05c741967698de7d070329ea6516b6f49136353ba8b8f901f25 not found: ID does not exist" Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.693468 4865 scope.go:117] "RemoveContainer" containerID="c956312fc3f72f6237550b02995a337f662c523a50185bc8098c0a7ff35b42b3" Dec 05 06:18:53 crc kubenswrapper[4865]: E1205 06:18:53.693847 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c956312fc3f72f6237550b02995a337f662c523a50185bc8098c0a7ff35b42b3\": container with ID starting with c956312fc3f72f6237550b02995a337f662c523a50185bc8098c0a7ff35b42b3 not found: ID does not exist" containerID="c956312fc3f72f6237550b02995a337f662c523a50185bc8098c0a7ff35b42b3" Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.693877 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c956312fc3f72f6237550b02995a337f662c523a50185bc8098c0a7ff35b42b3"} err="failed to get container status \"c956312fc3f72f6237550b02995a337f662c523a50185bc8098c0a7ff35b42b3\": rpc error: code = NotFound desc = could not find container \"c956312fc3f72f6237550b02995a337f662c523a50185bc8098c0a7ff35b42b3\": container with ID starting with c956312fc3f72f6237550b02995a337f662c523a50185bc8098c0a7ff35b42b3 not found: ID does not exist" Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.693895 4865 scope.go:117] "RemoveContainer" containerID="51be9a7b9dea9c434826a1d24e0ca0e503d8ec3ed914b84da72232baedae6b6a" Dec 05 06:18:53 crc kubenswrapper[4865]: E1205 06:18:53.694115 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51be9a7b9dea9c434826a1d24e0ca0e503d8ec3ed914b84da72232baedae6b6a\": container with ID starting with 51be9a7b9dea9c434826a1d24e0ca0e503d8ec3ed914b84da72232baedae6b6a not found: ID does not exist" containerID="51be9a7b9dea9c434826a1d24e0ca0e503d8ec3ed914b84da72232baedae6b6a" Dec 05 06:18:53 crc kubenswrapper[4865]: I1205 06:18:53.694148 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51be9a7b9dea9c434826a1d24e0ca0e503d8ec3ed914b84da72232baedae6b6a"} err="failed to get container status \"51be9a7b9dea9c434826a1d24e0ca0e503d8ec3ed914b84da72232baedae6b6a\": rpc error: code = NotFound desc = could not find container \"51be9a7b9dea9c434826a1d24e0ca0e503d8ec3ed914b84da72232baedae6b6a\": container with ID starting with 51be9a7b9dea9c434826a1d24e0ca0e503d8ec3ed914b84da72232baedae6b6a not found: ID does not exist" Dec 05 06:18:55 crc kubenswrapper[4865]: I1205 06:18:55.019101 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="133cb552-10d7-4050-80ea-db2f1570cb32" path="/var/lib/kubelet/pods/133cb552-10d7-4050-80ea-db2f1570cb32/volumes" Dec 05 06:19:06 crc kubenswrapper[4865]: I1205 06:19:06.007520 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:19:06 crc kubenswrapper[4865]: E1205 06:19:06.009643 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:19:20 crc kubenswrapper[4865]: I1205 
06:19:20.006347 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:19:20 crc kubenswrapper[4865]: E1205 06:19:20.007040 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:19:32 crc kubenswrapper[4865]: I1205 06:19:32.006806 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:19:32 crc kubenswrapper[4865]: E1205 06:19:32.008024 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:19:45 crc kubenswrapper[4865]: I1205 06:19:45.007894 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:19:45 crc kubenswrapper[4865]: E1205 06:19:45.009249 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:19:58 crc kubenswrapper[4865]: I1205 06:19:58.006888 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:19:58 crc kubenswrapper[4865]: E1205 06:19:58.007761 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:20:11 crc kubenswrapper[4865]: I1205 06:20:11.015731 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:20:11 crc kubenswrapper[4865]: E1205 06:20:11.017434 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:20:26 crc kubenswrapper[4865]: I1205 06:20:26.006785 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:20:26 crc kubenswrapper[4865]: E1205 06:20:26.007743 
4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:20:38 crc kubenswrapper[4865]: I1205 06:20:38.006656 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:20:38 crc kubenswrapper[4865]: E1205 06:20:38.007787 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:20:44 crc kubenswrapper[4865]: I1205 06:20:44.068253 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-60a0-account-create-update-24m4m"] Dec 05 06:20:44 crc kubenswrapper[4865]: I1205 06:20:44.095656 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-60a0-account-create-update-24m4m"] Dec 05 06:20:45 crc kubenswrapper[4865]: I1205 06:20:45.018551 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d4fad1c-f15e-4d44-a71c-24d196f3c8fe" path="/var/lib/kubelet/pods/6d4fad1c-f15e-4d44-a71c-24d196f3c8fe/volumes" Dec 05 06:20:45 crc kubenswrapper[4865]: I1205 06:20:45.066816 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-f6dc-account-create-update-k7dxc"] Dec 05 06:20:45 crc kubenswrapper[4865]: I1205 06:20:45.077117 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-thvvr"] Dec 05 06:20:45 crc kubenswrapper[4865]: I1205 06:20:45.084426 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-89fd-account-create-update-w9tdj"] Dec 05 06:20:45 crc kubenswrapper[4865]: I1205 06:20:45.091614 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-f6dc-account-create-update-k7dxc"] Dec 05 06:20:45 crc kubenswrapper[4865]: I1205 06:20:45.100109 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-kktpw"] Dec 05 06:20:45 crc kubenswrapper[4865]: I1205 06:20:45.109772 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-89fd-account-create-update-w9tdj"] Dec 05 06:20:45 crc kubenswrapper[4865]: I1205 06:20:45.117389 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-kktpw"] Dec 05 06:20:45 crc kubenswrapper[4865]: I1205 06:20:45.124021 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-thvvr"] Dec 05 06:20:45 crc kubenswrapper[4865]: I1205 06:20:45.130772 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-w8nx8"] Dec 05 06:20:45 crc kubenswrapper[4865]: I1205 06:20:45.147062 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-w8nx8"] Dec 05 06:20:47 crc kubenswrapper[4865]: I1205 06:20:47.024594 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18914495-0dfa-4528-ac93-942ccad6f5a3" 
path="/var/lib/kubelet/pods/18914495-0dfa-4528-ac93-942ccad6f5a3/volumes" Dec 05 06:20:47 crc kubenswrapper[4865]: I1205 06:20:47.025698 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="295a1eb6-1b02-45f2-81ad-f5fae06d4146" path="/var/lib/kubelet/pods/295a1eb6-1b02-45f2-81ad-f5fae06d4146/volumes" Dec 05 06:20:47 crc kubenswrapper[4865]: I1205 06:20:47.026578 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32904ad2-4fdc-4dc2-9b0e-2726c0c30b37" path="/var/lib/kubelet/pods/32904ad2-4fdc-4dc2-9b0e-2726c0c30b37/volumes" Dec 05 06:20:47 crc kubenswrapper[4865]: I1205 06:20:47.027308 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8059be38-adc8-49f5-96f5-f5144c4ac8ee" path="/var/lib/kubelet/pods/8059be38-adc8-49f5-96f5-f5144c4ac8ee/volumes" Dec 05 06:20:47 crc kubenswrapper[4865]: I1205 06:20:47.028544 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3ab494b-c8fa-42da-af71-d24aaaafe086" path="/var/lib/kubelet/pods/c3ab494b-c8fa-42da-af71-d24aaaafe086/volumes" Dec 05 06:20:53 crc kubenswrapper[4865]: I1205 06:20:53.006531 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:20:53 crc kubenswrapper[4865]: E1205 06:20:53.007245 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:21:05 crc kubenswrapper[4865]: I1205 06:21:05.007067 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:21:05 crc kubenswrapper[4865]: E1205 06:21:05.008104 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:21:16 crc kubenswrapper[4865]: I1205 06:21:16.006424 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:21:16 crc kubenswrapper[4865]: E1205 06:21:16.007313 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:21:16 crc kubenswrapper[4865]: I1205 06:21:16.066732 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-dj2h4"] Dec 05 06:21:16 crc kubenswrapper[4865]: I1205 06:21:16.080913 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-dj2h4"] Dec 05 06:21:17 crc kubenswrapper[4865]: I1205 06:21:17.030688 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e6d4176e-7d68-48c6-9e9a-c507558508ab" path="/var/lib/kubelet/pods/e6d4176e-7d68-48c6-9e9a-c507558508ab/volumes" Dec 05 06:21:30 crc kubenswrapper[4865]: I1205 06:21:30.006354 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:21:30 crc kubenswrapper[4865]: E1205 06:21:30.006989 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:21:32 crc kubenswrapper[4865]: I1205 06:21:32.072893 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-16c7-account-create-update-w2r86"] Dec 05 06:21:32 crc kubenswrapper[4865]: I1205 06:21:32.081583 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-bbd2l"] Dec 05 06:21:32 crc kubenswrapper[4865]: I1205 06:21:32.089895 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2426-account-create-update-jpx49"] Dec 05 06:21:32 crc kubenswrapper[4865]: I1205 06:21:32.096919 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-mf52x"] Dec 05 06:21:32 crc kubenswrapper[4865]: I1205 06:21:32.105142 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-16c7-account-create-update-w2r86"] Dec 05 06:21:32 crc kubenswrapper[4865]: I1205 06:21:32.112322 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6eb1-account-create-update-czskk"] Dec 05 06:21:32 crc kubenswrapper[4865]: I1205 06:21:32.119545 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-mf52x"] Dec 05 06:21:32 crc kubenswrapper[4865]: I1205 06:21:32.127949 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-bbd2l"] Dec 05 06:21:32 crc kubenswrapper[4865]: I1205 06:21:32.137543 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-2426-account-create-update-jpx49"] Dec 05 06:21:32 crc kubenswrapper[4865]: I1205 06:21:32.147746 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-ntj72"] Dec 05 06:21:32 crc kubenswrapper[4865]: I1205 06:21:32.157196 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-ntj72"] Dec 05 06:21:32 crc kubenswrapper[4865]: I1205 06:21:32.166329 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6eb1-account-create-update-czskk"] Dec 05 06:21:33 crc kubenswrapper[4865]: I1205 06:21:33.024152 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11931045-4e2b-4720-9f3b-745b2215e3ac" path="/var/lib/kubelet/pods/11931045-4e2b-4720-9f3b-745b2215e3ac/volumes" Dec 05 06:21:33 crc kubenswrapper[4865]: I1205 06:21:33.026765 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30202162-03c8-4d05-b31d-fbb7900ae067" path="/var/lib/kubelet/pods/30202162-03c8-4d05-b31d-fbb7900ae067/volumes" Dec 05 06:21:33 crc kubenswrapper[4865]: I1205 06:21:33.027643 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ad1f76-ae1d-4f90-9fcd-819dc43fc598" path="/var/lib/kubelet/pods/49ad1f76-ae1d-4f90-9fcd-819dc43fc598/volumes" 
Dec 05 06:21:33 crc kubenswrapper[4865]: I1205 06:21:33.028892 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66a615ef-1dfa-45c7-8d5d-bb694d7d13ad" path="/var/lib/kubelet/pods/66a615ef-1dfa-45c7-8d5d-bb694d7d13ad/volumes" Dec 05 06:21:33 crc kubenswrapper[4865]: I1205 06:21:33.030146 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75d26b7c-932e-4c37-95d3-9f5dc0b874b5" path="/var/lib/kubelet/pods/75d26b7c-932e-4c37-95d3-9f5dc0b874b5/volumes" Dec 05 06:21:33 crc kubenswrapper[4865]: I1205 06:21:33.031618 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fde195ed-2c6b-465f-b496-97c7d604f1c6" path="/var/lib/kubelet/pods/fde195ed-2c6b-465f-b496-97c7d604f1c6/volumes" Dec 05 06:21:37 crc kubenswrapper[4865]: I1205 06:21:37.006466 4865 scope.go:117] "RemoveContainer" containerID="bd379a7f707eb9e4bcd18d9f4d4e47111ad3d132718a50ba1a509fc5cb1dac96" Dec 05 06:21:37 crc kubenswrapper[4865]: I1205 06:21:37.063927 4865 scope.go:117] "RemoveContainer" containerID="e36bca61d1c70996f3fe55847ad2ef9574f572837d618bf7422a89620a376f28" Dec 05 06:21:37 crc kubenswrapper[4865]: I1205 06:21:37.124206 4865 scope.go:117] "RemoveContainer" containerID="9284e885fd5bc8313d85d153dfeab728819bf379c36989710816d5947b445b0a" Dec 05 06:21:37 crc kubenswrapper[4865]: I1205 06:21:37.158345 4865 scope.go:117] "RemoveContainer" containerID="72c6191d824e26de87e9c5027b3f37ce377b6322b626e2cf4c741959ccbe8da9" Dec 05 06:21:37 crc kubenswrapper[4865]: I1205 06:21:37.198012 4865 scope.go:117] "RemoveContainer" containerID="4dde5a9d8f8a9f7a0e3e18ebb1c3818bf01d24ffbff326e631bb27f89a42531c" Dec 05 06:21:37 crc kubenswrapper[4865]: I1205 06:21:37.235571 4865 scope.go:117] "RemoveContainer" containerID="b2dd81e91639317e8627af8ae3f58a29590afd371319684aff06671931431e02" Dec 05 06:21:37 crc kubenswrapper[4865]: I1205 06:21:37.276754 4865 scope.go:117] "RemoveContainer" containerID="89fca49ca21b8feac967a5eeffa7bb4c719b9c5501b5233893286f0ae5f9189f" Dec 05 06:21:37 crc kubenswrapper[4865]: I1205 06:21:37.298222 4865 scope.go:117] "RemoveContainer" containerID="ba16dd313ee63c2b879a00bf928c24910a963011ac18e2191dbd94c0823e66d9" Dec 05 06:21:37 crc kubenswrapper[4865]: I1205 06:21:37.320743 4865 scope.go:117] "RemoveContainer" containerID="8e5c2d3d4e7df6172c4efc9f25b33f8a738a71181a274dfd788c0efdff9b041e" Dec 05 06:21:37 crc kubenswrapper[4865]: I1205 06:21:37.345354 4865 scope.go:117] "RemoveContainer" containerID="5abc0ceebbe39a8c1a04192bd44fc9eb3f7e855167250463cb468c6d6b069e96" Dec 05 06:21:37 crc kubenswrapper[4865]: I1205 06:21:37.366958 4865 scope.go:117] "RemoveContainer" containerID="8abad379185f9c1e2a91321dbcd07e7a65581e638b884992b75c260c2a6c3be1" Dec 05 06:21:37 crc kubenswrapper[4865]: I1205 06:21:37.385985 4865 scope.go:117] "RemoveContainer" containerID="49ec4a20d00a1d81303f01e61c881bb5a94b7b365bf85e056ff3ec74e256f265" Dec 05 06:21:37 crc kubenswrapper[4865]: I1205 06:21:37.407951 4865 scope.go:117] "RemoveContainer" containerID="e6dcc2573a6d56fc745e9ae9f9b1e60b32677fa1ced22780bfc93d86ee54d7c7" Dec 05 06:21:37 crc kubenswrapper[4865]: I1205 06:21:37.452057 4865 scope.go:117] "RemoveContainer" containerID="6ec36b8581d4250a63a5fcf8a2c7398a8913c045b64c12b12db1744a4cbc776b" Dec 05 06:21:37 crc kubenswrapper[4865]: I1205 06:21:37.475736 4865 scope.go:117] "RemoveContainer" containerID="dcd36b620cd531d364f2b49652887b92dafe53598709ef92028c21c00e70f2c4" Dec 05 06:21:37 crc kubenswrapper[4865]: I1205 06:21:37.507455 4865 scope.go:117] "RemoveContainer" 
containerID="d7b0c16db59b802d99b2c3229577baf8f2f85618e20d780445d3e57a955896af" Dec 05 06:21:37 crc kubenswrapper[4865]: I1205 06:21:37.529964 4865 scope.go:117] "RemoveContainer" containerID="7091a08c26e97d59579c6ea300531c523e6460ca30e5d89e01208cd1e166a674" Dec 05 06:21:41 crc kubenswrapper[4865]: I1205 06:21:41.045648 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-mh5z9"] Dec 05 06:21:41 crc kubenswrapper[4865]: I1205 06:21:41.064973 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-mh5z9"] Dec 05 06:21:42 crc kubenswrapper[4865]: I1205 06:21:42.006690 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:21:42 crc kubenswrapper[4865]: E1205 06:21:42.007318 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:21:43 crc kubenswrapper[4865]: I1205 06:21:43.022938 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b061e42-22fc-4b4b-8ebe-20496b1a9f17" path="/var/lib/kubelet/pods/2b061e42-22fc-4b4b-8ebe-20496b1a9f17/volumes" Dec 05 06:21:55 crc kubenswrapper[4865]: I1205 06:21:55.007147 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:21:55 crc kubenswrapper[4865]: E1205 06:21:55.007886 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:21:56 crc kubenswrapper[4865]: I1205 06:21:56.045038 4865 generic.go:334] "Generic (PLEG): container finished" podID="ea0e7080-5e20-4b45-9896-2cda6b9e332f" containerID="5c835f421fd7c5f54dc953730de284f20c334dedd0961332bdc10d9281888f0c" exitCode=0 Dec 05 06:21:56 crc kubenswrapper[4865]: I1205 06:21:56.045362 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" event={"ID":"ea0e7080-5e20-4b45-9896-2cda6b9e332f","Type":"ContainerDied","Data":"5c835f421fd7c5f54dc953730de284f20c334dedd0961332bdc10d9281888f0c"} Dec 05 06:21:57 crc kubenswrapper[4865]: I1205 06:21:57.654414 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" Dec 05 06:21:57 crc kubenswrapper[4865]: I1205 06:21:57.740570 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea0e7080-5e20-4b45-9896-2cda6b9e332f-inventory\") pod \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\" (UID: \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\") " Dec 05 06:21:57 crc kubenswrapper[4865]: I1205 06:21:57.740631 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea0e7080-5e20-4b45-9896-2cda6b9e332f-ssh-key\") pod \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\" (UID: \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\") " Dec 05 06:21:57 crc kubenswrapper[4865]: I1205 06:21:57.740732 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f7bx\" (UniqueName: \"kubernetes.io/projected/ea0e7080-5e20-4b45-9896-2cda6b9e332f-kube-api-access-5f7bx\") pod \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\" (UID: \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\") " Dec 05 06:21:57 crc kubenswrapper[4865]: I1205 06:21:57.741019 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea0e7080-5e20-4b45-9896-2cda6b9e332f-bootstrap-combined-ca-bundle\") pod \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\" (UID: \"ea0e7080-5e20-4b45-9896-2cda6b9e332f\") " Dec 05 06:21:57 crc kubenswrapper[4865]: I1205 06:21:57.746195 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea0e7080-5e20-4b45-9896-2cda6b9e332f-kube-api-access-5f7bx" (OuterVolumeSpecName: "kube-api-access-5f7bx") pod "ea0e7080-5e20-4b45-9896-2cda6b9e332f" (UID: "ea0e7080-5e20-4b45-9896-2cda6b9e332f"). InnerVolumeSpecName "kube-api-access-5f7bx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:21:57 crc kubenswrapper[4865]: I1205 06:21:57.749157 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea0e7080-5e20-4b45-9896-2cda6b9e332f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ea0e7080-5e20-4b45-9896-2cda6b9e332f" (UID: "ea0e7080-5e20-4b45-9896-2cda6b9e332f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:21:57 crc kubenswrapper[4865]: I1205 06:21:57.787755 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea0e7080-5e20-4b45-9896-2cda6b9e332f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ea0e7080-5e20-4b45-9896-2cda6b9e332f" (UID: "ea0e7080-5e20-4b45-9896-2cda6b9e332f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:21:57 crc kubenswrapper[4865]: I1205 06:21:57.798738 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea0e7080-5e20-4b45-9896-2cda6b9e332f-inventory" (OuterVolumeSpecName: "inventory") pod "ea0e7080-5e20-4b45-9896-2cda6b9e332f" (UID: "ea0e7080-5e20-4b45-9896-2cda6b9e332f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:21:57 crc kubenswrapper[4865]: I1205 06:21:57.844521 4865 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea0e7080-5e20-4b45-9896-2cda6b9e332f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:21:57 crc kubenswrapper[4865]: I1205 06:21:57.844549 4865 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea0e7080-5e20-4b45-9896-2cda6b9e332f-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 06:21:57 crc kubenswrapper[4865]: I1205 06:21:57.844558 4865 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea0e7080-5e20-4b45-9896-2cda6b9e332f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:21:57 crc kubenswrapper[4865]: I1205 06:21:57.844566 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f7bx\" (UniqueName: \"kubernetes.io/projected/ea0e7080-5e20-4b45-9896-2cda6b9e332f-kube-api-access-5f7bx\") on node \"crc\" DevicePath \"\"" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.091404 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" event={"ID":"ea0e7080-5e20-4b45-9896-2cda6b9e332f","Type":"ContainerDied","Data":"827e75372c84b26fe830a5edac483a0ac126f9b4f60872af156bbc94ecc9c6fa"} Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.091456 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="827e75372c84b26fe830a5edac483a0ac126f9b4f60872af156bbc94ecc9c6fa" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.091560 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.188183 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt"] Dec 05 06:21:58 crc kubenswrapper[4865]: E1205 06:21:58.188651 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="133cb552-10d7-4050-80ea-db2f1570cb32" containerName="extract-content" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.188667 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="133cb552-10d7-4050-80ea-db2f1570cb32" containerName="extract-content" Dec 05 06:21:58 crc kubenswrapper[4865]: E1205 06:21:58.188692 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="133cb552-10d7-4050-80ea-db2f1570cb32" containerName="registry-server" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.188698 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="133cb552-10d7-4050-80ea-db2f1570cb32" containerName="registry-server" Dec 05 06:21:58 crc kubenswrapper[4865]: E1205 06:21:58.188712 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="133cb552-10d7-4050-80ea-db2f1570cb32" containerName="extract-utilities" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.188718 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="133cb552-10d7-4050-80ea-db2f1570cb32" containerName="extract-utilities" Dec 05 06:21:58 crc kubenswrapper[4865]: E1205 06:21:58.188751 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0e7080-5e20-4b45-9896-2cda6b9e332f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.188758 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0e7080-5e20-4b45-9896-2cda6b9e332f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.188958 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="133cb552-10d7-4050-80ea-db2f1570cb32" containerName="registry-server" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.189008 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea0e7080-5e20-4b45-9896-2cda6b9e332f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.189616 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.195095 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.195096 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.195274 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.195424 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gtc4b" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.208219 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt"] Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.353368 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0f77448-e553-45f7-90db-3a800258bdf3-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt\" (UID: \"e0f77448-e553-45f7-90db-3a800258bdf3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.353473 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2f98\" (UniqueName: \"kubernetes.io/projected/e0f77448-e553-45f7-90db-3a800258bdf3-kube-api-access-q2f98\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt\" (UID: \"e0f77448-e553-45f7-90db-3a800258bdf3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.353495 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0f77448-e553-45f7-90db-3a800258bdf3-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt\" (UID: \"e0f77448-e553-45f7-90db-3a800258bdf3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.455733 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2f98\" (UniqueName: \"kubernetes.io/projected/e0f77448-e553-45f7-90db-3a800258bdf3-kube-api-access-q2f98\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt\" (UID: \"e0f77448-e553-45f7-90db-3a800258bdf3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.455777 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0f77448-e553-45f7-90db-3a800258bdf3-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt\" (UID: \"e0f77448-e553-45f7-90db-3a800258bdf3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.455958 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0f77448-e553-45f7-90db-3a800258bdf3-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt\" (UID: \"e0f77448-e553-45f7-90db-3a800258bdf3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.460382 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0f77448-e553-45f7-90db-3a800258bdf3-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt\" (UID: \"e0f77448-e553-45f7-90db-3a800258bdf3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.463427 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0f77448-e553-45f7-90db-3a800258bdf3-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt\" (UID: \"e0f77448-e553-45f7-90db-3a800258bdf3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.478164 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2f98\" (UniqueName: \"kubernetes.io/projected/e0f77448-e553-45f7-90db-3a800258bdf3-kube-api-access-q2f98\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt\" (UID: \"e0f77448-e553-45f7-90db-3a800258bdf3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt" Dec 05 06:21:58 crc kubenswrapper[4865]: I1205 06:21:58.515442 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt" Dec 05 06:21:59 crc kubenswrapper[4865]: I1205 06:21:59.139051 4865 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 06:21:59 crc kubenswrapper[4865]: I1205 06:21:59.139754 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt"] Dec 05 06:22:00 crc kubenswrapper[4865]: I1205 06:22:00.113273 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt" event={"ID":"e0f77448-e553-45f7-90db-3a800258bdf3","Type":"ContainerStarted","Data":"b6b98eff35e309fe4217600fd7571598659ccfdf2949dc28647c5fef5929d21f"} Dec 05 06:22:00 crc kubenswrapper[4865]: I1205 06:22:00.113491 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt" event={"ID":"e0f77448-e553-45f7-90db-3a800258bdf3","Type":"ContainerStarted","Data":"07bbd23026c2299cd10cec38f3bddac12377700a52c47e848d399107186193bd"} Dec 05 06:22:00 crc kubenswrapper[4865]: I1205 06:22:00.143863 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt" podStartSLOduration=1.6405796480000001 podStartE2EDuration="2.143844277s" podCreationTimestamp="2025-12-05 06:21:58 +0000 UTC" firstStartedPulling="2025-12-05 06:21:59.138773751 +0000 UTC m=+1738.418784973" lastFinishedPulling="2025-12-05 06:21:59.64203838 +0000 UTC m=+1738.922049602" observedRunningTime="2025-12-05 06:22:00.133816573 +0000 UTC m=+1739.413827795" watchObservedRunningTime="2025-12-05 06:22:00.143844277 +0000 UTC m=+1739.423855499" Dec 05 06:22:10 crc kubenswrapper[4865]: I1205 06:22:10.006071 4865 scope.go:117] "RemoveContainer" 
containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:22:10 crc kubenswrapper[4865]: E1205 06:22:10.006760 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:22:21 crc kubenswrapper[4865]: I1205 06:22:21.049853 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-g8vbt"] Dec 05 06:22:21 crc kubenswrapper[4865]: I1205 06:22:21.059523 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-g8vbt"] Dec 05 06:22:22 crc kubenswrapper[4865]: I1205 06:22:22.343475 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:22:22 crc kubenswrapper[4865]: E1205 06:22:22.344733 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:22:23 crc kubenswrapper[4865]: I1205 06:22:23.019201 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4021310b-c06b-44b3-9c95-7ca10552da10" path="/var/lib/kubelet/pods/4021310b-c06b-44b3-9c95-7ca10552da10/volumes" Dec 05 06:22:34 crc kubenswrapper[4865]: I1205 06:22:34.007315 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:22:34 crc kubenswrapper[4865]: E1205 06:22:34.008197 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:22:37 crc kubenswrapper[4865]: I1205 06:22:37.071560 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-mn58x"] Dec 05 06:22:37 crc kubenswrapper[4865]: I1205 06:22:37.082871 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-lcs2x"] Dec 05 06:22:37 crc kubenswrapper[4865]: I1205 06:22:37.093337 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-xs9sp"] Dec 05 06:22:37 crc kubenswrapper[4865]: I1205 06:22:37.104120 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-lcs2x"] Dec 05 06:22:37 crc kubenswrapper[4865]: I1205 06:22:37.112523 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-xs9sp"] Dec 05 06:22:37 crc kubenswrapper[4865]: I1205 06:22:37.121349 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-mn58x"] Dec 05 06:22:37 crc kubenswrapper[4865]: I1205 06:22:37.807885 4865 scope.go:117] "RemoveContainer" 
containerID="56ab844b39f7f229e643ac1e2e555f59ffc130412970cd5a6d57b22a5128c88c" Dec 05 06:22:37 crc kubenswrapper[4865]: I1205 06:22:37.853412 4865 scope.go:117] "RemoveContainer" containerID="f8e43214c4c3a9e4ed8bbd0e1bce8943bef1b7b3c16c4984f7e3aa526397e66f" Dec 05 06:22:39 crc kubenswrapper[4865]: I1205 06:22:39.018155 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1de4159c-2d90-4b3a-bcff-84f293a59c35" path="/var/lib/kubelet/pods/1de4159c-2d90-4b3a-bcff-84f293a59c35/volumes" Dec 05 06:22:39 crc kubenswrapper[4865]: I1205 06:22:39.019893 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f798138-a4f1-490f-8904-cfccbf0db793" path="/var/lib/kubelet/pods/4f798138-a4f1-490f-8904-cfccbf0db793/volumes" Dec 05 06:22:39 crc kubenswrapper[4865]: I1205 06:22:39.020974 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96bbdbcb-6f86-41ff-99bc-1af813144fd4" path="/var/lib/kubelet/pods/96bbdbcb-6f86-41ff-99bc-1af813144fd4/volumes" Dec 05 06:22:48 crc kubenswrapper[4865]: I1205 06:22:48.006652 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:22:48 crc kubenswrapper[4865]: E1205 06:22:48.008773 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:22:54 crc kubenswrapper[4865]: I1205 06:22:54.052863 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-f7d5b"] Dec 05 06:22:54 crc kubenswrapper[4865]: I1205 06:22:54.062382 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-f7d5b"] Dec 05 06:22:55 crc kubenswrapper[4865]: I1205 06:22:55.019208 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5e4dce7-c9e7-4813-a957-1df502644792" path="/var/lib/kubelet/pods/b5e4dce7-c9e7-4813-a957-1df502644792/volumes" Dec 05 06:23:03 crc kubenswrapper[4865]: I1205 06:23:03.007631 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:23:03 crc kubenswrapper[4865]: E1205 06:23:03.010626 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:23:16 crc kubenswrapper[4865]: I1205 06:23:16.007185 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:23:16 crc kubenswrapper[4865]: E1205 06:23:16.008218 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" 
podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:23:31 crc kubenswrapper[4865]: I1205 06:23:31.015318 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:23:31 crc kubenswrapper[4865]: E1205 06:23:31.016543 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:23:33 crc kubenswrapper[4865]: I1205 06:23:33.061509 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-6lclk"] Dec 05 06:23:33 crc kubenswrapper[4865]: I1205 06:23:33.082484 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-eee8-account-create-update-2jnjx"] Dec 05 06:23:33 crc kubenswrapper[4865]: I1205 06:23:33.091625 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-8qkdx"] Dec 05 06:23:33 crc kubenswrapper[4865]: I1205 06:23:33.103138 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4d95-account-create-update-xsbvp"] Dec 05 06:23:33 crc kubenswrapper[4865]: I1205 06:23:33.110595 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-8qkdx"] Dec 05 06:23:33 crc kubenswrapper[4865]: I1205 06:23:33.118237 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4d95-account-create-update-xsbvp"] Dec 05 06:23:33 crc kubenswrapper[4865]: I1205 06:23:33.128636 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-6lclk"] Dec 05 06:23:33 crc kubenswrapper[4865]: I1205 06:23:33.139810 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-eee8-account-create-update-2jnjx"] Dec 05 06:23:34 crc kubenswrapper[4865]: I1205 06:23:34.032736 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-r4lqw"] Dec 05 06:23:34 crc kubenswrapper[4865]: I1205 06:23:34.044727 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-f091-account-create-update-c4wdz"] Dec 05 06:23:34 crc kubenswrapper[4865]: I1205 06:23:34.055640 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-r4lqw"] Dec 05 06:23:34 crc kubenswrapper[4865]: I1205 06:23:34.065677 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-f091-account-create-update-c4wdz"] Dec 05 06:23:35 crc kubenswrapper[4865]: I1205 06:23:35.024097 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cec8b30-efea-4cd5-be6c-889b5dc59008" path="/var/lib/kubelet/pods/2cec8b30-efea-4cd5-be6c-889b5dc59008/volumes" Dec 05 06:23:35 crc kubenswrapper[4865]: I1205 06:23:35.024943 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="519c901d-31cc-4600-96bb-66ecd95aba90" path="/var/lib/kubelet/pods/519c901d-31cc-4600-96bb-66ecd95aba90/volumes" Dec 05 06:23:35 crc kubenswrapper[4865]: I1205 06:23:35.025635 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74b7332b-f0c3-4e97-9149-0b09e4f74727" path="/var/lib/kubelet/pods/74b7332b-f0c3-4e97-9149-0b09e4f74727/volumes" Dec 05 06:23:35 crc kubenswrapper[4865]: I1205 
06:23:35.026864 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7733d2ef-e9ed-4977-b294-3941be6b9455" path="/var/lib/kubelet/pods/7733d2ef-e9ed-4977-b294-3941be6b9455/volumes" Dec 05 06:23:35 crc kubenswrapper[4865]: I1205 06:23:35.028302 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8240c19-58ea-4d42-ac99-121c7f01e2f2" path="/var/lib/kubelet/pods/a8240c19-58ea-4d42-ac99-121c7f01e2f2/volumes" Dec 05 06:23:35 crc kubenswrapper[4865]: I1205 06:23:35.029239 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce59a384-76dd-444f-8e42-e4eb194e48e9" path="/var/lib/kubelet/pods/ce59a384-76dd-444f-8e42-e4eb194e48e9/volumes" Dec 05 06:23:37 crc kubenswrapper[4865]: I1205 06:23:37.935259 4865 scope.go:117] "RemoveContainer" containerID="cad80ed9ffc04d8f5694f91d5bd9697516287fb44fa1b3cf466f846b4b5584b7" Dec 05 06:23:37 crc kubenswrapper[4865]: I1205 06:23:37.975218 4865 scope.go:117] "RemoveContainer" containerID="a9145a9a4c3cb9abfd5ddb8a4ccea3b5c568087506ed84014d0eb1eb2681eb5c" Dec 05 06:23:38 crc kubenswrapper[4865]: I1205 06:23:38.027596 4865 scope.go:117] "RemoveContainer" containerID="32a3bbce1be81ebb05c72067d0b57678445c07251b83bb02f85a5a839fb2dfca" Dec 05 06:23:38 crc kubenswrapper[4865]: I1205 06:23:38.068616 4865 scope.go:117] "RemoveContainer" containerID="58f16f1f4e44058939b4a82217da1cf07e666d16acef06252e4f4dfa4c57709f" Dec 05 06:23:38 crc kubenswrapper[4865]: I1205 06:23:38.112427 4865 scope.go:117] "RemoveContainer" containerID="066d65d1a75ea3c41d2ddb1ec2ab58bf494f925726dba1b8b5732272a6307b53" Dec 05 06:23:38 crc kubenswrapper[4865]: I1205 06:23:38.162119 4865 scope.go:117] "RemoveContainer" containerID="df462c8b519563ff848c3dfe981818298a9b61625006590653518fd116417fd6" Dec 05 06:23:38 crc kubenswrapper[4865]: I1205 06:23:38.218784 4865 scope.go:117] "RemoveContainer" containerID="d7e05362ff84d81aa3cde40cc9f1c8996030cf683f693687fb8224024bf65310" Dec 05 06:23:38 crc kubenswrapper[4865]: I1205 06:23:38.245253 4865 scope.go:117] "RemoveContainer" containerID="5be37aa6e0a55b62c7458bf4560c5ec09f2218715a4863038da8dfbf2f86bdf4" Dec 05 06:23:38 crc kubenswrapper[4865]: I1205 06:23:38.266476 4865 scope.go:117] "RemoveContainer" containerID="1041512cc1fbafb37ee520703e17b180cfd4480e7fa2f54343a1bb3e79106b4d" Dec 05 06:23:38 crc kubenswrapper[4865]: I1205 06:23:38.290106 4865 scope.go:117] "RemoveContainer" containerID="9618f2107757ba62a6db52c30717edd999f8918dfb8277916fb5e111d328acbe" Dec 05 06:23:44 crc kubenswrapper[4865]: I1205 06:23:44.007304 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:23:45 crc kubenswrapper[4865]: I1205 06:23:45.228668 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"8298b19a58cec01e49fca2a020c44af4ff6818830b5baef2bd67270b8780a994"} Dec 05 06:24:07 crc kubenswrapper[4865]: I1205 06:24:07.078032 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-g2g6p"] Dec 05 06:24:07 crc kubenswrapper[4865]: I1205 06:24:07.088902 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-g2g6p"] Dec 05 06:24:09 crc kubenswrapper[4865]: I1205 06:24:09.018702 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1" 
path="/var/lib/kubelet/pods/9cdda782-13a7-4c36-a8f3-7b1d09fd2ca1/volumes" Dec 05 06:24:09 crc kubenswrapper[4865]: I1205 06:24:09.445290 4865 generic.go:334] "Generic (PLEG): container finished" podID="e0f77448-e553-45f7-90db-3a800258bdf3" containerID="b6b98eff35e309fe4217600fd7571598659ccfdf2949dc28647c5fef5929d21f" exitCode=0 Dec 05 06:24:09 crc kubenswrapper[4865]: I1205 06:24:09.445343 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt" event={"ID":"e0f77448-e553-45f7-90db-3a800258bdf3","Type":"ContainerDied","Data":"b6b98eff35e309fe4217600fd7571598659ccfdf2949dc28647c5fef5929d21f"} Dec 05 06:24:10 crc kubenswrapper[4865]: I1205 06:24:10.885544 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt" Dec 05 06:24:10 crc kubenswrapper[4865]: I1205 06:24:10.889324 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0f77448-e553-45f7-90db-3a800258bdf3-inventory\") pod \"e0f77448-e553-45f7-90db-3a800258bdf3\" (UID: \"e0f77448-e553-45f7-90db-3a800258bdf3\") " Dec 05 06:24:10 crc kubenswrapper[4865]: I1205 06:24:10.889471 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0f77448-e553-45f7-90db-3a800258bdf3-ssh-key\") pod \"e0f77448-e553-45f7-90db-3a800258bdf3\" (UID: \"e0f77448-e553-45f7-90db-3a800258bdf3\") " Dec 05 06:24:10 crc kubenswrapper[4865]: I1205 06:24:10.889627 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2f98\" (UniqueName: \"kubernetes.io/projected/e0f77448-e553-45f7-90db-3a800258bdf3-kube-api-access-q2f98\") pod \"e0f77448-e553-45f7-90db-3a800258bdf3\" (UID: \"e0f77448-e553-45f7-90db-3a800258bdf3\") " Dec 05 06:24:10 crc kubenswrapper[4865]: I1205 06:24:10.895767 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0f77448-e553-45f7-90db-3a800258bdf3-kube-api-access-q2f98" (OuterVolumeSpecName: "kube-api-access-q2f98") pod "e0f77448-e553-45f7-90db-3a800258bdf3" (UID: "e0f77448-e553-45f7-90db-3a800258bdf3"). InnerVolumeSpecName "kube-api-access-q2f98". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:24:10 crc kubenswrapper[4865]: I1205 06:24:10.928068 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f77448-e553-45f7-90db-3a800258bdf3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e0f77448-e553-45f7-90db-3a800258bdf3" (UID: "e0f77448-e553-45f7-90db-3a800258bdf3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:24:10 crc kubenswrapper[4865]: I1205 06:24:10.944192 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f77448-e553-45f7-90db-3a800258bdf3-inventory" (OuterVolumeSpecName: "inventory") pod "e0f77448-e553-45f7-90db-3a800258bdf3" (UID: "e0f77448-e553-45f7-90db-3a800258bdf3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:24:10 crc kubenswrapper[4865]: I1205 06:24:10.991972 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2f98\" (UniqueName: \"kubernetes.io/projected/e0f77448-e553-45f7-90db-3a800258bdf3-kube-api-access-q2f98\") on node \"crc\" DevicePath \"\"" Dec 05 06:24:10 crc kubenswrapper[4865]: I1205 06:24:10.992002 4865 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0f77448-e553-45f7-90db-3a800258bdf3-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 06:24:10 crc kubenswrapper[4865]: I1205 06:24:10.992012 4865 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0f77448-e553-45f7-90db-3a800258bdf3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.468245 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt" event={"ID":"e0f77448-e553-45f7-90db-3a800258bdf3","Type":"ContainerDied","Data":"07bbd23026c2299cd10cec38f3bddac12377700a52c47e848d399107186193bd"} Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.468294 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07bbd23026c2299cd10cec38f3bddac12377700a52c47e848d399107186193bd" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.468807 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.574061 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z"] Dec 05 06:24:11 crc kubenswrapper[4865]: E1205 06:24:11.574659 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f77448-e553-45f7-90db-3a800258bdf3" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.574742 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f77448-e553-45f7-90db-3a800258bdf3" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.575013 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0f77448-e553-45f7-90db-3a800258bdf3" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.575765 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.585246 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gtc4b" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.585315 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.585392 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.585879 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z"] Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.585899 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.613460 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ba48e1b-5d9a-436a-8250-297390ed1781-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2652z\" (UID: \"7ba48e1b-5d9a-436a-8250-297390ed1781\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.613522 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4x6l\" (UniqueName: \"kubernetes.io/projected/7ba48e1b-5d9a-436a-8250-297390ed1781-kube-api-access-n4x6l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2652z\" (UID: \"7ba48e1b-5d9a-436a-8250-297390ed1781\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.613598 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ba48e1b-5d9a-436a-8250-297390ed1781-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2652z\" (UID: \"7ba48e1b-5d9a-436a-8250-297390ed1781\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.715697 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ba48e1b-5d9a-436a-8250-297390ed1781-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2652z\" (UID: \"7ba48e1b-5d9a-436a-8250-297390ed1781\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.715753 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4x6l\" (UniqueName: \"kubernetes.io/projected/7ba48e1b-5d9a-436a-8250-297390ed1781-kube-api-access-n4x6l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2652z\" (UID: \"7ba48e1b-5d9a-436a-8250-297390ed1781\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.715816 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ba48e1b-5d9a-436a-8250-297390ed1781-inventory\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2652z\" (UID: \"7ba48e1b-5d9a-436a-8250-297390ed1781\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.719637 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ba48e1b-5d9a-436a-8250-297390ed1781-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2652z\" (UID: \"7ba48e1b-5d9a-436a-8250-297390ed1781\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.722593 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ba48e1b-5d9a-436a-8250-297390ed1781-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2652z\" (UID: \"7ba48e1b-5d9a-436a-8250-297390ed1781\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.732113 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4x6l\" (UniqueName: \"kubernetes.io/projected/7ba48e1b-5d9a-436a-8250-297390ed1781-kube-api-access-n4x6l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2652z\" (UID: \"7ba48e1b-5d9a-436a-8250-297390ed1781\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z" Dec 05 06:24:11 crc kubenswrapper[4865]: I1205 06:24:11.913195 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z" Dec 05 06:24:12 crc kubenswrapper[4865]: I1205 06:24:12.532074 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z"] Dec 05 06:24:13 crc kubenswrapper[4865]: I1205 06:24:13.495888 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z" event={"ID":"7ba48e1b-5d9a-436a-8250-297390ed1781","Type":"ContainerStarted","Data":"58a98791d2edf2f7fa8b72593fd549a8bd99760872edffe629f4993d968e95d9"} Dec 05 06:24:15 crc kubenswrapper[4865]: I1205 06:24:15.519676 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z" event={"ID":"7ba48e1b-5d9a-436a-8250-297390ed1781","Type":"ContainerStarted","Data":"521c29344ff1cc6b92a7929c3a9068c2bdc1a1c7acbe710b5e8206840bf895e3"} Dec 05 06:24:15 crc kubenswrapper[4865]: I1205 06:24:15.538265 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z" podStartSLOduration=2.719266313 podStartE2EDuration="4.538247928s" podCreationTimestamp="2025-12-05 06:24:11 +0000 UTC" firstStartedPulling="2025-12-05 06:24:12.54288858 +0000 UTC m=+1871.822899822" lastFinishedPulling="2025-12-05 06:24:14.361870215 +0000 UTC m=+1873.641881437" observedRunningTime="2025-12-05 06:24:15.536250842 +0000 UTC m=+1874.816262084" watchObservedRunningTime="2025-12-05 06:24:15.538247928 +0000 UTC m=+1874.818259150" Dec 05 06:24:30 crc kubenswrapper[4865]: I1205 06:24:30.059639 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-lt8qn"] Dec 05 06:24:30 crc kubenswrapper[4865]: I1205 06:24:30.077148 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-cell-mapping-lt8qn"] Dec 05 06:24:31 crc kubenswrapper[4865]: I1205 06:24:31.029801 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67260032-55e2-4709-84b1-577259ffa891" path="/var/lib/kubelet/pods/67260032-55e2-4709-84b1-577259ffa891/volumes" Dec 05 06:24:32 crc kubenswrapper[4865]: I1205 06:24:32.035128 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-skfnv"] Dec 05 06:24:32 crc kubenswrapper[4865]: I1205 06:24:32.043185 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-skfnv"] Dec 05 06:24:33 crc kubenswrapper[4865]: I1205 06:24:33.017652 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="479f47b5-b756-41b2-af12-ca6fcbe867a3" path="/var/lib/kubelet/pods/479f47b5-b756-41b2-af12-ca6fcbe867a3/volumes" Dec 05 06:24:38 crc kubenswrapper[4865]: I1205 06:24:38.494841 4865 scope.go:117] "RemoveContainer" containerID="a8ec670a519c895770256b56e62232ad7cd36b7e39e166cdefeec80fa4470e4d" Dec 05 06:24:38 crc kubenswrapper[4865]: I1205 06:24:38.545792 4865 scope.go:117] "RemoveContainer" containerID="ba7bfaeeb4e0660eb347f24521eb2c636d6843df8db40fff9d365c3fe42dbabb" Dec 05 06:24:38 crc kubenswrapper[4865]: I1205 06:24:38.613296 4865 scope.go:117] "RemoveContainer" containerID="6870896ac2ef3a4699b002484f042cdd44757cdd6a9115f2c877d993d650e74e" Dec 05 06:25:16 crc kubenswrapper[4865]: I1205 06:25:16.042575 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-vfgpb"] Dec 05 06:25:16 crc kubenswrapper[4865]: I1205 06:25:16.063076 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-vfgpb"] Dec 05 06:25:17 crc kubenswrapper[4865]: I1205 06:25:17.018606 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="573cd206-8f29-473e-8394-e862c8ef17e5" path="/var/lib/kubelet/pods/573cd206-8f29-473e-8394-e862c8ef17e5/volumes" Dec 05 06:25:38 crc kubenswrapper[4865]: I1205 06:25:38.747908 4865 scope.go:117] "RemoveContainer" containerID="34bfcd69dd61aeb075770fc5b52a4b2474415129d3302513db78c50edaffde24" Dec 05 06:25:43 crc kubenswrapper[4865]: I1205 06:25:43.212705 4865 generic.go:334] "Generic (PLEG): container finished" podID="7ba48e1b-5d9a-436a-8250-297390ed1781" containerID="521c29344ff1cc6b92a7929c3a9068c2bdc1a1c7acbe710b5e8206840bf895e3" exitCode=0 Dec 05 06:25:43 crc kubenswrapper[4865]: I1205 06:25:43.213155 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z" event={"ID":"7ba48e1b-5d9a-436a-8250-297390ed1781","Type":"ContainerDied","Data":"521c29344ff1cc6b92a7929c3a9068c2bdc1a1c7acbe710b5e8206840bf895e3"} Dec 05 06:25:44 crc kubenswrapper[4865]: I1205 06:25:44.736914 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z" Dec 05 06:25:44 crc kubenswrapper[4865]: I1205 06:25:44.911137 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4x6l\" (UniqueName: \"kubernetes.io/projected/7ba48e1b-5d9a-436a-8250-297390ed1781-kube-api-access-n4x6l\") pod \"7ba48e1b-5d9a-436a-8250-297390ed1781\" (UID: \"7ba48e1b-5d9a-436a-8250-297390ed1781\") " Dec 05 06:25:44 crc kubenswrapper[4865]: I1205 06:25:44.911230 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ba48e1b-5d9a-436a-8250-297390ed1781-inventory\") pod \"7ba48e1b-5d9a-436a-8250-297390ed1781\" (UID: \"7ba48e1b-5d9a-436a-8250-297390ed1781\") " Dec 05 06:25:44 crc kubenswrapper[4865]: I1205 06:25:44.911347 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ba48e1b-5d9a-436a-8250-297390ed1781-ssh-key\") pod \"7ba48e1b-5d9a-436a-8250-297390ed1781\" (UID: \"7ba48e1b-5d9a-436a-8250-297390ed1781\") " Dec 05 06:25:44 crc kubenswrapper[4865]: I1205 06:25:44.921369 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ba48e1b-5d9a-436a-8250-297390ed1781-kube-api-access-n4x6l" (OuterVolumeSpecName: "kube-api-access-n4x6l") pod "7ba48e1b-5d9a-436a-8250-297390ed1781" (UID: "7ba48e1b-5d9a-436a-8250-297390ed1781"). InnerVolumeSpecName "kube-api-access-n4x6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:25:44 crc kubenswrapper[4865]: I1205 06:25:44.946309 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ba48e1b-5d9a-436a-8250-297390ed1781-inventory" (OuterVolumeSpecName: "inventory") pod "7ba48e1b-5d9a-436a-8250-297390ed1781" (UID: "7ba48e1b-5d9a-436a-8250-297390ed1781"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:25:44 crc kubenswrapper[4865]: I1205 06:25:44.962047 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ba48e1b-5d9a-436a-8250-297390ed1781-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7ba48e1b-5d9a-436a-8250-297390ed1781" (UID: "7ba48e1b-5d9a-436a-8250-297390ed1781"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.015633 4865 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ba48e1b-5d9a-436a-8250-297390ed1781-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.015857 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4x6l\" (UniqueName: \"kubernetes.io/projected/7ba48e1b-5d9a-436a-8250-297390ed1781-kube-api-access-n4x6l\") on node \"crc\" DevicePath \"\"" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.015927 4865 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ba48e1b-5d9a-436a-8250-297390ed1781-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.235515 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z" event={"ID":"7ba48e1b-5d9a-436a-8250-297390ed1781","Type":"ContainerDied","Data":"58a98791d2edf2f7fa8b72593fd549a8bd99760872edffe629f4993d968e95d9"} Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.235594 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58a98791d2edf2f7fa8b72593fd549a8bd99760872edffe629f4993d968e95d9" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.235698 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2652z" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.345107 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2"] Dec 05 06:25:45 crc kubenswrapper[4865]: E1205 06:25:45.345657 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ba48e1b-5d9a-436a-8250-297390ed1781" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.345679 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ba48e1b-5d9a-436a-8250-297390ed1781" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.345983 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ba48e1b-5d9a-436a-8250-297390ed1781" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.346869 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.349906 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.354195 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.354207 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gtc4b" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.360497 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2"] Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.360785 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.526079 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3b4e4ee-2945-4c62-97e2-c561996ed302-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2\" (UID: \"f3b4e4ee-2945-4c62-97e2-c561996ed302\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.526945 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3b4e4ee-2945-4c62-97e2-c561996ed302-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2\" (UID: \"f3b4e4ee-2945-4c62-97e2-c561996ed302\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.527015 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qcgc\" (UniqueName: \"kubernetes.io/projected/f3b4e4ee-2945-4c62-97e2-c561996ed302-kube-api-access-6qcgc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2\" (UID: \"f3b4e4ee-2945-4c62-97e2-c561996ed302\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.629367 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qcgc\" (UniqueName: \"kubernetes.io/projected/f3b4e4ee-2945-4c62-97e2-c561996ed302-kube-api-access-6qcgc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2\" (UID: \"f3b4e4ee-2945-4c62-97e2-c561996ed302\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.629601 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3b4e4ee-2945-4c62-97e2-c561996ed302-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2\" (UID: \"f3b4e4ee-2945-4c62-97e2-c561996ed302\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.629713 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3b4e4ee-2945-4c62-97e2-c561996ed302-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2\" (UID: \"f3b4e4ee-2945-4c62-97e2-c561996ed302\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.634866 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3b4e4ee-2945-4c62-97e2-c561996ed302-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2\" (UID: \"f3b4e4ee-2945-4c62-97e2-c561996ed302\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.636155 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3b4e4ee-2945-4c62-97e2-c561996ed302-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2\" (UID: \"f3b4e4ee-2945-4c62-97e2-c561996ed302\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.647497 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qcgc\" (UniqueName: \"kubernetes.io/projected/f3b4e4ee-2945-4c62-97e2-c561996ed302-kube-api-access-6qcgc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2\" (UID: \"f3b4e4ee-2945-4c62-97e2-c561996ed302\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2" Dec 05 06:25:45 crc kubenswrapper[4865]: I1205 06:25:45.683610 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2" Dec 05 06:25:46 crc kubenswrapper[4865]: I1205 06:25:46.179544 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2"] Dec 05 06:25:46 crc kubenswrapper[4865]: W1205 06:25:46.188725 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3b4e4ee_2945_4c62_97e2_c561996ed302.slice/crio-4b9745664378ec97e9714274a8d66d686284f46212d8bc70f607a02dd095705a WatchSource:0}: Error finding container 4b9745664378ec97e9714274a8d66d686284f46212d8bc70f607a02dd095705a: Status 404 returned error can't find the container with id 4b9745664378ec97e9714274a8d66d686284f46212d8bc70f607a02dd095705a Dec 05 06:25:46 crc kubenswrapper[4865]: I1205 06:25:46.247459 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2" event={"ID":"f3b4e4ee-2945-4c62-97e2-c561996ed302","Type":"ContainerStarted","Data":"4b9745664378ec97e9714274a8d66d686284f46212d8bc70f607a02dd095705a"} Dec 05 06:25:47 crc kubenswrapper[4865]: I1205 06:25:47.260896 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2" event={"ID":"f3b4e4ee-2945-4c62-97e2-c561996ed302","Type":"ContainerStarted","Data":"05796ddda91189c25ba805a5cf305bec3d7d5b6727a84abd4322efc3b85e43f0"} Dec 05 06:25:47 crc kubenswrapper[4865]: I1205 06:25:47.283791 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2" podStartSLOduration=1.713259546 podStartE2EDuration="2.283773949s" podCreationTimestamp="2025-12-05 06:25:45 +0000 UTC" firstStartedPulling="2025-12-05 06:25:46.191802474 +0000 UTC m=+1965.471813726" 
lastFinishedPulling="2025-12-05 06:25:46.762316887 +0000 UTC m=+1966.042328129" observedRunningTime="2025-12-05 06:25:47.282587475 +0000 UTC m=+1966.562598707" watchObservedRunningTime="2025-12-05 06:25:47.283773949 +0000 UTC m=+1966.563785171" Dec 05 06:25:53 crc kubenswrapper[4865]: I1205 06:25:53.314048 4865 generic.go:334] "Generic (PLEG): container finished" podID="f3b4e4ee-2945-4c62-97e2-c561996ed302" containerID="05796ddda91189c25ba805a5cf305bec3d7d5b6727a84abd4322efc3b85e43f0" exitCode=0 Dec 05 06:25:53 crc kubenswrapper[4865]: I1205 06:25:53.314228 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2" event={"ID":"f3b4e4ee-2945-4c62-97e2-c561996ed302","Type":"ContainerDied","Data":"05796ddda91189c25ba805a5cf305bec3d7d5b6727a84abd4322efc3b85e43f0"} Dec 05 06:25:54 crc kubenswrapper[4865]: I1205 06:25:54.801673 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2" Dec 05 06:25:54 crc kubenswrapper[4865]: I1205 06:25:54.915969 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3b4e4ee-2945-4c62-97e2-c561996ed302-ssh-key\") pod \"f3b4e4ee-2945-4c62-97e2-c561996ed302\" (UID: \"f3b4e4ee-2945-4c62-97e2-c561996ed302\") " Dec 05 06:25:54 crc kubenswrapper[4865]: I1205 06:25:54.916102 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3b4e4ee-2945-4c62-97e2-c561996ed302-inventory\") pod \"f3b4e4ee-2945-4c62-97e2-c561996ed302\" (UID: \"f3b4e4ee-2945-4c62-97e2-c561996ed302\") " Dec 05 06:25:54 crc kubenswrapper[4865]: I1205 06:25:54.916129 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qcgc\" (UniqueName: \"kubernetes.io/projected/f3b4e4ee-2945-4c62-97e2-c561996ed302-kube-api-access-6qcgc\") pod \"f3b4e4ee-2945-4c62-97e2-c561996ed302\" (UID: \"f3b4e4ee-2945-4c62-97e2-c561996ed302\") " Dec 05 06:25:54 crc kubenswrapper[4865]: I1205 06:25:54.926193 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3b4e4ee-2945-4c62-97e2-c561996ed302-kube-api-access-6qcgc" (OuterVolumeSpecName: "kube-api-access-6qcgc") pod "f3b4e4ee-2945-4c62-97e2-c561996ed302" (UID: "f3b4e4ee-2945-4c62-97e2-c561996ed302"). InnerVolumeSpecName "kube-api-access-6qcgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:25:54 crc kubenswrapper[4865]: I1205 06:25:54.952250 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3b4e4ee-2945-4c62-97e2-c561996ed302-inventory" (OuterVolumeSpecName: "inventory") pod "f3b4e4ee-2945-4c62-97e2-c561996ed302" (UID: "f3b4e4ee-2945-4c62-97e2-c561996ed302"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:25:54 crc kubenswrapper[4865]: I1205 06:25:54.953556 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3b4e4ee-2945-4c62-97e2-c561996ed302-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f3b4e4ee-2945-4c62-97e2-c561996ed302" (UID: "f3b4e4ee-2945-4c62-97e2-c561996ed302"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.019316 4865 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3b4e4ee-2945-4c62-97e2-c561996ed302-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.019388 4865 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3b4e4ee-2945-4c62-97e2-c561996ed302-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.019406 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qcgc\" (UniqueName: \"kubernetes.io/projected/f3b4e4ee-2945-4c62-97e2-c561996ed302-kube-api-access-6qcgc\") on node \"crc\" DevicePath \"\"" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.342376 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2" event={"ID":"f3b4e4ee-2945-4c62-97e2-c561996ed302","Type":"ContainerDied","Data":"4b9745664378ec97e9714274a8d66d686284f46212d8bc70f607a02dd095705a"} Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.342439 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b9745664378ec97e9714274a8d66d686284f46212d8bc70f607a02dd095705a" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.342534 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.458674 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp"] Dec 05 06:25:55 crc kubenswrapper[4865]: E1205 06:25:55.459541 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3b4e4ee-2945-4c62-97e2-c561996ed302" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.459604 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3b4e4ee-2945-4c62-97e2-c561996ed302" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.459945 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3b4e4ee-2945-4c62-97e2-c561996ed302" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.461174 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.465265 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.465329 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.465554 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.465656 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gtc4b" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.485345 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp"] Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.634700 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbg2x\" (UniqueName: \"kubernetes.io/projected/65d6bbea-eb81-4cfb-ba7c-e56d423884f8-kube-api-access-lbg2x\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pdpbp\" (UID: \"65d6bbea-eb81-4cfb-ba7c-e56d423884f8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.635282 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65d6bbea-eb81-4cfb-ba7c-e56d423884f8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pdpbp\" (UID: \"65d6bbea-eb81-4cfb-ba7c-e56d423884f8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.635554 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65d6bbea-eb81-4cfb-ba7c-e56d423884f8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pdpbp\" (UID: \"65d6bbea-eb81-4cfb-ba7c-e56d423884f8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.738125 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbg2x\" (UniqueName: \"kubernetes.io/projected/65d6bbea-eb81-4cfb-ba7c-e56d423884f8-kube-api-access-lbg2x\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pdpbp\" (UID: \"65d6bbea-eb81-4cfb-ba7c-e56d423884f8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.738390 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65d6bbea-eb81-4cfb-ba7c-e56d423884f8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pdpbp\" (UID: \"65d6bbea-eb81-4cfb-ba7c-e56d423884f8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.738761 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65d6bbea-eb81-4cfb-ba7c-e56d423884f8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pdpbp\" (UID: 
\"65d6bbea-eb81-4cfb-ba7c-e56d423884f8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.745571 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65d6bbea-eb81-4cfb-ba7c-e56d423884f8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pdpbp\" (UID: \"65d6bbea-eb81-4cfb-ba7c-e56d423884f8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.749624 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65d6bbea-eb81-4cfb-ba7c-e56d423884f8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pdpbp\" (UID: \"65d6bbea-eb81-4cfb-ba7c-e56d423884f8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.762434 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbg2x\" (UniqueName: \"kubernetes.io/projected/65d6bbea-eb81-4cfb-ba7c-e56d423884f8-kube-api-access-lbg2x\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pdpbp\" (UID: \"65d6bbea-eb81-4cfb-ba7c-e56d423884f8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp" Dec 05 06:25:55 crc kubenswrapper[4865]: I1205 06:25:55.797125 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp" Dec 05 06:25:56 crc kubenswrapper[4865]: I1205 06:25:56.376157 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp"] Dec 05 06:25:57 crc kubenswrapper[4865]: I1205 06:25:57.365421 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp" event={"ID":"65d6bbea-eb81-4cfb-ba7c-e56d423884f8","Type":"ContainerStarted","Data":"975b175dc8eb23ce1c091aaf1147925479f722f60a82d6a04947a08caa9e85fe"} Dec 05 06:25:57 crc kubenswrapper[4865]: I1205 06:25:57.366571 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp" event={"ID":"65d6bbea-eb81-4cfb-ba7c-e56d423884f8","Type":"ContainerStarted","Data":"760f6de16e46eb77c75810856cbf4adea073714998fa62bb1803774bfdc8dfd5"} Dec 05 06:25:57 crc kubenswrapper[4865]: I1205 06:25:57.398141 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp" podStartSLOduration=2.021707828 podStartE2EDuration="2.398109835s" podCreationTimestamp="2025-12-05 06:25:55 +0000 UTC" firstStartedPulling="2025-12-05 06:25:56.386976143 +0000 UTC m=+1975.666987365" lastFinishedPulling="2025-12-05 06:25:56.76337816 +0000 UTC m=+1976.043389372" observedRunningTime="2025-12-05 06:25:57.392510596 +0000 UTC m=+1976.672521818" watchObservedRunningTime="2025-12-05 06:25:57.398109835 +0000 UTC m=+1976.678121047" Dec 05 06:26:11 crc kubenswrapper[4865]: I1205 06:26:11.050039 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:26:11 crc kubenswrapper[4865]: I1205 06:26:11.051176 4865 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:26:26 crc kubenswrapper[4865]: I1205 06:26:26.354903 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fgp8x"] Dec 05 06:26:26 crc kubenswrapper[4865]: I1205 06:26:26.357779 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fgp8x" Dec 05 06:26:26 crc kubenswrapper[4865]: I1205 06:26:26.387263 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fgp8x"] Dec 05 06:26:26 crc kubenswrapper[4865]: I1205 06:26:26.465488 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pjrc\" (UniqueName: \"kubernetes.io/projected/4a32ce78-44d5-4d70-a9ca-d75a17a26a7a-kube-api-access-2pjrc\") pod \"redhat-operators-fgp8x\" (UID: \"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a\") " pod="openshift-marketplace/redhat-operators-fgp8x" Dec 05 06:26:26 crc kubenswrapper[4865]: I1205 06:26:26.465570 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a32ce78-44d5-4d70-a9ca-d75a17a26a7a-catalog-content\") pod \"redhat-operators-fgp8x\" (UID: \"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a\") " pod="openshift-marketplace/redhat-operators-fgp8x" Dec 05 06:26:26 crc kubenswrapper[4865]: I1205 06:26:26.466089 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a32ce78-44d5-4d70-a9ca-d75a17a26a7a-utilities\") pod \"redhat-operators-fgp8x\" (UID: \"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a\") " pod="openshift-marketplace/redhat-operators-fgp8x" Dec 05 06:26:26 crc kubenswrapper[4865]: I1205 06:26:26.567882 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pjrc\" (UniqueName: \"kubernetes.io/projected/4a32ce78-44d5-4d70-a9ca-d75a17a26a7a-kube-api-access-2pjrc\") pod \"redhat-operators-fgp8x\" (UID: \"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a\") " pod="openshift-marketplace/redhat-operators-fgp8x" Dec 05 06:26:26 crc kubenswrapper[4865]: I1205 06:26:26.567976 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a32ce78-44d5-4d70-a9ca-d75a17a26a7a-catalog-content\") pod \"redhat-operators-fgp8x\" (UID: \"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a\") " pod="openshift-marketplace/redhat-operators-fgp8x" Dec 05 06:26:26 crc kubenswrapper[4865]: I1205 06:26:26.568081 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a32ce78-44d5-4d70-a9ca-d75a17a26a7a-utilities\") pod \"redhat-operators-fgp8x\" (UID: \"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a\") " pod="openshift-marketplace/redhat-operators-fgp8x" Dec 05 06:26:26 crc kubenswrapper[4865]: I1205 06:26:26.568634 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a32ce78-44d5-4d70-a9ca-d75a17a26a7a-catalog-content\") pod \"redhat-operators-fgp8x\" (UID: 
\"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a\") " pod="openshift-marketplace/redhat-operators-fgp8x" Dec 05 06:26:26 crc kubenswrapper[4865]: I1205 06:26:26.568677 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a32ce78-44d5-4d70-a9ca-d75a17a26a7a-utilities\") pod \"redhat-operators-fgp8x\" (UID: \"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a\") " pod="openshift-marketplace/redhat-operators-fgp8x" Dec 05 06:26:26 crc kubenswrapper[4865]: I1205 06:26:26.615380 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pjrc\" (UniqueName: \"kubernetes.io/projected/4a32ce78-44d5-4d70-a9ca-d75a17a26a7a-kube-api-access-2pjrc\") pod \"redhat-operators-fgp8x\" (UID: \"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a\") " pod="openshift-marketplace/redhat-operators-fgp8x" Dec 05 06:26:26 crc kubenswrapper[4865]: I1205 06:26:26.678756 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fgp8x" Dec 05 06:26:27 crc kubenswrapper[4865]: I1205 06:26:27.169064 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fgp8x"] Dec 05 06:26:27 crc kubenswrapper[4865]: I1205 06:26:27.699166 4865 generic.go:334] "Generic (PLEG): container finished" podID="4a32ce78-44d5-4d70-a9ca-d75a17a26a7a" containerID="9907736d7c1f0799b16f5c3cce76f16e0973e04c8155653fc41de5771bd4ac23" exitCode=0 Dec 05 06:26:27 crc kubenswrapper[4865]: I1205 06:26:27.699702 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgp8x" event={"ID":"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a","Type":"ContainerDied","Data":"9907736d7c1f0799b16f5c3cce76f16e0973e04c8155653fc41de5771bd4ac23"} Dec 05 06:26:27 crc kubenswrapper[4865]: I1205 06:26:27.699735 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgp8x" event={"ID":"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a","Type":"ContainerStarted","Data":"6a5ee03c37e151c0d6d95b755c20876e4c28215f170276af814b7c3929b5ae76"} Dec 05 06:26:28 crc kubenswrapper[4865]: I1205 06:26:28.710996 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgp8x" event={"ID":"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a","Type":"ContainerStarted","Data":"09f528f268565c269e1ac5ba7b7a681ebe1f669556574d6a4fa6a329539fe06f"} Dec 05 06:26:32 crc kubenswrapper[4865]: I1205 06:26:32.745649 4865 generic.go:334] "Generic (PLEG): container finished" podID="4a32ce78-44d5-4d70-a9ca-d75a17a26a7a" containerID="09f528f268565c269e1ac5ba7b7a681ebe1f669556574d6a4fa6a329539fe06f" exitCode=0 Dec 05 06:26:32 crc kubenswrapper[4865]: I1205 06:26:32.745717 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgp8x" event={"ID":"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a","Type":"ContainerDied","Data":"09f528f268565c269e1ac5ba7b7a681ebe1f669556574d6a4fa6a329539fe06f"} Dec 05 06:26:33 crc kubenswrapper[4865]: I1205 06:26:33.758073 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgp8x" event={"ID":"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a","Type":"ContainerStarted","Data":"552f0df78064765308b1734e5d24d82abf5db60068c4647d9526a814f96de28e"} Dec 05 06:26:33 crc kubenswrapper[4865]: I1205 06:26:33.788408 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fgp8x" podStartSLOduration=2.3118551 
podStartE2EDuration="7.788385683s" podCreationTimestamp="2025-12-05 06:26:26 +0000 UTC" firstStartedPulling="2025-12-05 06:26:27.702343871 +0000 UTC m=+2006.982355093" lastFinishedPulling="2025-12-05 06:26:33.178874454 +0000 UTC m=+2012.458885676" observedRunningTime="2025-12-05 06:26:33.784276336 +0000 UTC m=+2013.064287558" watchObservedRunningTime="2025-12-05 06:26:33.788385683 +0000 UTC m=+2013.068396915" Dec 05 06:26:36 crc kubenswrapper[4865]: I1205 06:26:36.679616 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fgp8x" Dec 05 06:26:36 crc kubenswrapper[4865]: I1205 06:26:36.680860 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fgp8x" Dec 05 06:26:37 crc kubenswrapper[4865]: I1205 06:26:37.748698 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fgp8x" podUID="4a32ce78-44d5-4d70-a9ca-d75a17a26a7a" containerName="registry-server" probeResult="failure" output=< Dec 05 06:26:37 crc kubenswrapper[4865]: timeout: failed to connect service ":50051" within 1s Dec 05 06:26:37 crc kubenswrapper[4865]: > Dec 05 06:26:41 crc kubenswrapper[4865]: I1205 06:26:41.049218 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:26:41 crc kubenswrapper[4865]: I1205 06:26:41.049731 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:26:45 crc kubenswrapper[4865]: I1205 06:26:45.869297 4865 generic.go:334] "Generic (PLEG): container finished" podID="65d6bbea-eb81-4cfb-ba7c-e56d423884f8" containerID="975b175dc8eb23ce1c091aaf1147925479f722f60a82d6a04947a08caa9e85fe" exitCode=0 Dec 05 06:26:45 crc kubenswrapper[4865]: I1205 06:26:45.869386 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp" event={"ID":"65d6bbea-eb81-4cfb-ba7c-e56d423884f8","Type":"ContainerDied","Data":"975b175dc8eb23ce1c091aaf1147925479f722f60a82d6a04947a08caa9e85fe"} Dec 05 06:26:46 crc kubenswrapper[4865]: I1205 06:26:46.724405 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fgp8x" Dec 05 06:26:46 crc kubenswrapper[4865]: I1205 06:26:46.777024 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fgp8x" Dec 05 06:26:46 crc kubenswrapper[4865]: I1205 06:26:46.972402 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fgp8x"] Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.382950 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp" Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.462129 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65d6bbea-eb81-4cfb-ba7c-e56d423884f8-ssh-key\") pod \"65d6bbea-eb81-4cfb-ba7c-e56d423884f8\" (UID: \"65d6bbea-eb81-4cfb-ba7c-e56d423884f8\") " Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.462198 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbg2x\" (UniqueName: \"kubernetes.io/projected/65d6bbea-eb81-4cfb-ba7c-e56d423884f8-kube-api-access-lbg2x\") pod \"65d6bbea-eb81-4cfb-ba7c-e56d423884f8\" (UID: \"65d6bbea-eb81-4cfb-ba7c-e56d423884f8\") " Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.462445 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65d6bbea-eb81-4cfb-ba7c-e56d423884f8-inventory\") pod \"65d6bbea-eb81-4cfb-ba7c-e56d423884f8\" (UID: \"65d6bbea-eb81-4cfb-ba7c-e56d423884f8\") " Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.485913 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65d6bbea-eb81-4cfb-ba7c-e56d423884f8-kube-api-access-lbg2x" (OuterVolumeSpecName: "kube-api-access-lbg2x") pod "65d6bbea-eb81-4cfb-ba7c-e56d423884f8" (UID: "65d6bbea-eb81-4cfb-ba7c-e56d423884f8"). InnerVolumeSpecName "kube-api-access-lbg2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.526283 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d6bbea-eb81-4cfb-ba7c-e56d423884f8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "65d6bbea-eb81-4cfb-ba7c-e56d423884f8" (UID: "65d6bbea-eb81-4cfb-ba7c-e56d423884f8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.535570 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d6bbea-eb81-4cfb-ba7c-e56d423884f8-inventory" (OuterVolumeSpecName: "inventory") pod "65d6bbea-eb81-4cfb-ba7c-e56d423884f8" (UID: "65d6bbea-eb81-4cfb-ba7c-e56d423884f8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.564816 4865 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65d6bbea-eb81-4cfb-ba7c-e56d423884f8-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.564860 4865 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65d6bbea-eb81-4cfb-ba7c-e56d423884f8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.564872 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbg2x\" (UniqueName: \"kubernetes.io/projected/65d6bbea-eb81-4cfb-ba7c-e56d423884f8-kube-api-access-lbg2x\") on node \"crc\" DevicePath \"\"" Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.892332 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp" event={"ID":"65d6bbea-eb81-4cfb-ba7c-e56d423884f8","Type":"ContainerDied","Data":"760f6de16e46eb77c75810856cbf4adea073714998fa62bb1803774bfdc8dfd5"} Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.892382 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pdpbp" Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.892394 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="760f6de16e46eb77c75810856cbf4adea073714998fa62bb1803774bfdc8dfd5" Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.892677 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fgp8x" podUID="4a32ce78-44d5-4d70-a9ca-d75a17a26a7a" containerName="registry-server" containerID="cri-o://552f0df78064765308b1734e5d24d82abf5db60068c4647d9526a814f96de28e" gracePeriod=2 Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.983151 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt"] Dec 05 06:26:47 crc kubenswrapper[4865]: E1205 06:26:47.985045 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65d6bbea-eb81-4cfb-ba7c-e56d423884f8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.985077 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="65d6bbea-eb81-4cfb-ba7c-e56d423884f8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.985359 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="65d6bbea-eb81-4cfb-ba7c-e56d423884f8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.986190 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt" Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.990705 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.990767 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.990930 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.990962 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gtc4b" Dec 05 06:26:47 crc kubenswrapper[4865]: I1205 06:26:47.992843 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt"] Dec 05 06:26:48 crc kubenswrapper[4865]: I1205 06:26:48.074435 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f01b2a46-843f-4022-ac72-af49312bbcc8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt\" (UID: \"f01b2a46-843f-4022-ac72-af49312bbcc8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt" Dec 05 06:26:48 crc kubenswrapper[4865]: I1205 06:26:48.074487 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f01b2a46-843f-4022-ac72-af49312bbcc8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt\" (UID: \"f01b2a46-843f-4022-ac72-af49312bbcc8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt" Dec 05 06:26:48 crc kubenswrapper[4865]: I1205 06:26:48.074634 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88lnt\" (UniqueName: \"kubernetes.io/projected/f01b2a46-843f-4022-ac72-af49312bbcc8-kube-api-access-88lnt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt\" (UID: \"f01b2a46-843f-4022-ac72-af49312bbcc8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt" Dec 05 06:26:48 crc kubenswrapper[4865]: I1205 06:26:48.176075 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f01b2a46-843f-4022-ac72-af49312bbcc8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt\" (UID: \"f01b2a46-843f-4022-ac72-af49312bbcc8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt" Dec 05 06:26:48 crc kubenswrapper[4865]: I1205 06:26:48.176144 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f01b2a46-843f-4022-ac72-af49312bbcc8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt\" (UID: \"f01b2a46-843f-4022-ac72-af49312bbcc8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt" Dec 05 06:26:48 crc kubenswrapper[4865]: I1205 06:26:48.176245 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88lnt\" (UniqueName: \"kubernetes.io/projected/f01b2a46-843f-4022-ac72-af49312bbcc8-kube-api-access-88lnt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt\" 
(UID: \"f01b2a46-843f-4022-ac72-af49312bbcc8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt" Dec 05 06:26:48 crc kubenswrapper[4865]: I1205 06:26:48.179786 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f01b2a46-843f-4022-ac72-af49312bbcc8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt\" (UID: \"f01b2a46-843f-4022-ac72-af49312bbcc8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt" Dec 05 06:26:48 crc kubenswrapper[4865]: I1205 06:26:48.181551 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f01b2a46-843f-4022-ac72-af49312bbcc8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt\" (UID: \"f01b2a46-843f-4022-ac72-af49312bbcc8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt" Dec 05 06:26:48 crc kubenswrapper[4865]: I1205 06:26:48.193193 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88lnt\" (UniqueName: \"kubernetes.io/projected/f01b2a46-843f-4022-ac72-af49312bbcc8-kube-api-access-88lnt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt\" (UID: \"f01b2a46-843f-4022-ac72-af49312bbcc8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt" Dec 05 06:26:48 crc kubenswrapper[4865]: I1205 06:26:48.300359 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt" Dec 05 06:26:48 crc kubenswrapper[4865]: I1205 06:26:48.861379 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt"] Dec 05 06:26:48 crc kubenswrapper[4865]: W1205 06:26:48.867617 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01b2a46_843f_4022_ac72_af49312bbcc8.slice/crio-f0a05d4b1ef4d11bba30bce985ba4d5303cb662bd0de8b5b866d21cf59bf47de WatchSource:0}: Error finding container f0a05d4b1ef4d11bba30bce985ba4d5303cb662bd0de8b5b866d21cf59bf47de: Status 404 returned error can't find the container with id f0a05d4b1ef4d11bba30bce985ba4d5303cb662bd0de8b5b866d21cf59bf47de Dec 05 06:26:48 crc kubenswrapper[4865]: I1205 06:26:48.902222 4865 generic.go:334] "Generic (PLEG): container finished" podID="4a32ce78-44d5-4d70-a9ca-d75a17a26a7a" containerID="552f0df78064765308b1734e5d24d82abf5db60068c4647d9526a814f96de28e" exitCode=0 Dec 05 06:26:48 crc kubenswrapper[4865]: I1205 06:26:48.902331 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgp8x" event={"ID":"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a","Type":"ContainerDied","Data":"552f0df78064765308b1734e5d24d82abf5db60068c4647d9526a814f96de28e"} Dec 05 06:26:48 crc kubenswrapper[4865]: I1205 06:26:48.903448 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt" event={"ID":"f01b2a46-843f-4022-ac72-af49312bbcc8","Type":"ContainerStarted","Data":"f0a05d4b1ef4d11bba30bce985ba4d5303cb662bd0de8b5b866d21cf59bf47de"} Dec 05 06:26:48 crc kubenswrapper[4865]: I1205 06:26:48.973415 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fgp8x" Dec 05 06:26:49 crc kubenswrapper[4865]: I1205 06:26:49.092235 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pjrc\" (UniqueName: \"kubernetes.io/projected/4a32ce78-44d5-4d70-a9ca-d75a17a26a7a-kube-api-access-2pjrc\") pod \"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a\" (UID: \"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a\") " Dec 05 06:26:49 crc kubenswrapper[4865]: I1205 06:26:49.093615 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a32ce78-44d5-4d70-a9ca-d75a17a26a7a-utilities\") pod \"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a\" (UID: \"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a\") " Dec 05 06:26:49 crc kubenswrapper[4865]: I1205 06:26:49.093884 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a32ce78-44d5-4d70-a9ca-d75a17a26a7a-catalog-content\") pod \"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a\" (UID: \"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a\") " Dec 05 06:26:49 crc kubenswrapper[4865]: I1205 06:26:49.094440 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a32ce78-44d5-4d70-a9ca-d75a17a26a7a-utilities" (OuterVolumeSpecName: "utilities") pod "4a32ce78-44d5-4d70-a9ca-d75a17a26a7a" (UID: "4a32ce78-44d5-4d70-a9ca-d75a17a26a7a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:26:49 crc kubenswrapper[4865]: I1205 06:26:49.094744 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a32ce78-44d5-4d70-a9ca-d75a17a26a7a-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:26:49 crc kubenswrapper[4865]: I1205 06:26:49.101111 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a32ce78-44d5-4d70-a9ca-d75a17a26a7a-kube-api-access-2pjrc" (OuterVolumeSpecName: "kube-api-access-2pjrc") pod "4a32ce78-44d5-4d70-a9ca-d75a17a26a7a" (UID: "4a32ce78-44d5-4d70-a9ca-d75a17a26a7a"). InnerVolumeSpecName "kube-api-access-2pjrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:26:49 crc kubenswrapper[4865]: I1205 06:26:49.197468 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pjrc\" (UniqueName: \"kubernetes.io/projected/4a32ce78-44d5-4d70-a9ca-d75a17a26a7a-kube-api-access-2pjrc\") on node \"crc\" DevicePath \"\"" Dec 05 06:26:49 crc kubenswrapper[4865]: I1205 06:26:49.206581 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a32ce78-44d5-4d70-a9ca-d75a17a26a7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a32ce78-44d5-4d70-a9ca-d75a17a26a7a" (UID: "4a32ce78-44d5-4d70-a9ca-d75a17a26a7a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:26:49 crc kubenswrapper[4865]: I1205 06:26:49.299452 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a32ce78-44d5-4d70-a9ca-d75a17a26a7a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:26:49 crc kubenswrapper[4865]: I1205 06:26:49.918680 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt" event={"ID":"f01b2a46-843f-4022-ac72-af49312bbcc8","Type":"ContainerStarted","Data":"217ef29b39166c76d818de3d2b3fd83a3d01d7c16fe8919b8079318a632dc8ef"} Dec 05 06:26:49 crc kubenswrapper[4865]: I1205 06:26:49.923875 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgp8x" event={"ID":"4a32ce78-44d5-4d70-a9ca-d75a17a26a7a","Type":"ContainerDied","Data":"6a5ee03c37e151c0d6d95b755c20876e4c28215f170276af814b7c3929b5ae76"} Dec 05 06:26:49 crc kubenswrapper[4865]: I1205 06:26:49.923929 4865 scope.go:117] "RemoveContainer" containerID="552f0df78064765308b1734e5d24d82abf5db60068c4647d9526a814f96de28e" Dec 05 06:26:49 crc kubenswrapper[4865]: I1205 06:26:49.924075 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fgp8x" Dec 05 06:26:49 crc kubenswrapper[4865]: I1205 06:26:49.944972 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt" podStartSLOduration=2.409269202 podStartE2EDuration="2.944947827s" podCreationTimestamp="2025-12-05 06:26:47 +0000 UTC" firstStartedPulling="2025-12-05 06:26:48.87010089 +0000 UTC m=+2028.150112112" lastFinishedPulling="2025-12-05 06:26:49.405779515 +0000 UTC m=+2028.685790737" observedRunningTime="2025-12-05 06:26:49.937205258 +0000 UTC m=+2029.217216490" watchObservedRunningTime="2025-12-05 06:26:49.944947827 +0000 UTC m=+2029.224959059" Dec 05 06:26:49 crc kubenswrapper[4865]: I1205 06:26:49.971220 4865 scope.go:117] "RemoveContainer" containerID="09f528f268565c269e1ac5ba7b7a681ebe1f669556574d6a4fa6a329539fe06f" Dec 05 06:26:49 crc kubenswrapper[4865]: I1205 06:26:49.975619 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fgp8x"] Dec 05 06:26:49 crc kubenswrapper[4865]: I1205 06:26:49.986880 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fgp8x"] Dec 05 06:26:49 crc kubenswrapper[4865]: I1205 06:26:49.999727 4865 scope.go:117] "RemoveContainer" containerID="9907736d7c1f0799b16f5c3cce76f16e0973e04c8155653fc41de5771bd4ac23" Dec 05 06:26:51 crc kubenswrapper[4865]: I1205 06:26:51.017592 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a32ce78-44d5-4d70-a9ca-d75a17a26a7a" path="/var/lib/kubelet/pods/4a32ce78-44d5-4d70-a9ca-d75a17a26a7a/volumes" Dec 05 06:27:11 crc kubenswrapper[4865]: I1205 06:27:11.049520 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:27:11 crc kubenswrapper[4865]: I1205 06:27:11.049970 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:27:11 crc kubenswrapper[4865]: I1205 06:27:11.050004 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 06:27:11 crc kubenswrapper[4865]: I1205 06:27:11.050517 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8298b19a58cec01e49fca2a020c44af4ff6818830b5baef2bd67270b8780a994"} pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 06:27:11 crc kubenswrapper[4865]: I1205 06:27:11.050559 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" containerID="cri-o://8298b19a58cec01e49fca2a020c44af4ff6818830b5baef2bd67270b8780a994" gracePeriod=600 Dec 05 06:27:12 crc kubenswrapper[4865]: I1205 06:27:12.151361 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="8298b19a58cec01e49fca2a020c44af4ff6818830b5baef2bd67270b8780a994" exitCode=0 Dec 05 06:27:12 crc kubenswrapper[4865]: I1205 06:27:12.151416 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"8298b19a58cec01e49fca2a020c44af4ff6818830b5baef2bd67270b8780a994"} Dec 05 06:27:12 crc kubenswrapper[4865]: I1205 06:27:12.151947 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a"} Dec 05 06:27:12 crc kubenswrapper[4865]: I1205 06:27:12.151973 4865 scope.go:117] "RemoveContainer" containerID="92fc7a504ad105a11a6bebf9cf3da4ff9cb9bcf24d92c815321b9d76b159439e" Dec 05 06:27:36 crc kubenswrapper[4865]: I1205 06:27:36.409366 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dpmbv"] Dec 05 06:27:36 crc kubenswrapper[4865]: E1205 06:27:36.410872 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a32ce78-44d5-4d70-a9ca-d75a17a26a7a" containerName="extract-utilities" Dec 05 06:27:36 crc kubenswrapper[4865]: I1205 06:27:36.410894 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a32ce78-44d5-4d70-a9ca-d75a17a26a7a" containerName="extract-utilities" Dec 05 06:27:36 crc kubenswrapper[4865]: E1205 06:27:36.410922 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a32ce78-44d5-4d70-a9ca-d75a17a26a7a" containerName="extract-content" Dec 05 06:27:36 crc kubenswrapper[4865]: I1205 06:27:36.410933 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a32ce78-44d5-4d70-a9ca-d75a17a26a7a" containerName="extract-content" Dec 05 06:27:36 crc kubenswrapper[4865]: E1205 06:27:36.410983 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a32ce78-44d5-4d70-a9ca-d75a17a26a7a" containerName="registry-server" Dec 05 06:27:36 crc kubenswrapper[4865]: I1205 06:27:36.410999 4865 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4a32ce78-44d5-4d70-a9ca-d75a17a26a7a" containerName="registry-server" Dec 05 06:27:36 crc kubenswrapper[4865]: I1205 06:27:36.411287 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a32ce78-44d5-4d70-a9ca-d75a17a26a7a" containerName="registry-server" Dec 05 06:27:36 crc kubenswrapper[4865]: I1205 06:27:36.414449 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dpmbv" Dec 05 06:27:36 crc kubenswrapper[4865]: I1205 06:27:36.430300 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dpmbv"] Dec 05 06:27:36 crc kubenswrapper[4865]: I1205 06:27:36.519180 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lkfh\" (UniqueName: \"kubernetes.io/projected/9c96356d-d787-4411-84e9-811b82aacc14-kube-api-access-4lkfh\") pod \"certified-operators-dpmbv\" (UID: \"9c96356d-d787-4411-84e9-811b82aacc14\") " pod="openshift-marketplace/certified-operators-dpmbv" Dec 05 06:27:36 crc kubenswrapper[4865]: I1205 06:27:36.519501 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c96356d-d787-4411-84e9-811b82aacc14-utilities\") pod \"certified-operators-dpmbv\" (UID: \"9c96356d-d787-4411-84e9-811b82aacc14\") " pod="openshift-marketplace/certified-operators-dpmbv" Dec 05 06:27:36 crc kubenswrapper[4865]: I1205 06:27:36.519761 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c96356d-d787-4411-84e9-811b82aacc14-catalog-content\") pod \"certified-operators-dpmbv\" (UID: \"9c96356d-d787-4411-84e9-811b82aacc14\") " pod="openshift-marketplace/certified-operators-dpmbv" Dec 05 06:27:36 crc kubenswrapper[4865]: I1205 06:27:36.621968 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c96356d-d787-4411-84e9-811b82aacc14-catalog-content\") pod \"certified-operators-dpmbv\" (UID: \"9c96356d-d787-4411-84e9-811b82aacc14\") " pod="openshift-marketplace/certified-operators-dpmbv" Dec 05 06:27:36 crc kubenswrapper[4865]: I1205 06:27:36.622110 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lkfh\" (UniqueName: \"kubernetes.io/projected/9c96356d-d787-4411-84e9-811b82aacc14-kube-api-access-4lkfh\") pod \"certified-operators-dpmbv\" (UID: \"9c96356d-d787-4411-84e9-811b82aacc14\") " pod="openshift-marketplace/certified-operators-dpmbv" Dec 05 06:27:36 crc kubenswrapper[4865]: I1205 06:27:36.622230 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c96356d-d787-4411-84e9-811b82aacc14-utilities\") pod \"certified-operators-dpmbv\" (UID: \"9c96356d-d787-4411-84e9-811b82aacc14\") " pod="openshift-marketplace/certified-operators-dpmbv" Dec 05 06:27:36 crc kubenswrapper[4865]: I1205 06:27:36.622610 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c96356d-d787-4411-84e9-811b82aacc14-catalog-content\") pod \"certified-operators-dpmbv\" (UID: \"9c96356d-d787-4411-84e9-811b82aacc14\") " pod="openshift-marketplace/certified-operators-dpmbv" Dec 05 06:27:36 crc 
kubenswrapper[4865]: I1205 06:27:36.622702 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c96356d-d787-4411-84e9-811b82aacc14-utilities\") pod \"certified-operators-dpmbv\" (UID: \"9c96356d-d787-4411-84e9-811b82aacc14\") " pod="openshift-marketplace/certified-operators-dpmbv" Dec 05 06:27:36 crc kubenswrapper[4865]: I1205 06:27:36.653521 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lkfh\" (UniqueName: \"kubernetes.io/projected/9c96356d-d787-4411-84e9-811b82aacc14-kube-api-access-4lkfh\") pod \"certified-operators-dpmbv\" (UID: \"9c96356d-d787-4411-84e9-811b82aacc14\") " pod="openshift-marketplace/certified-operators-dpmbv" Dec 05 06:27:36 crc kubenswrapper[4865]: I1205 06:27:36.735504 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dpmbv" Dec 05 06:27:37 crc kubenswrapper[4865]: I1205 06:27:37.275393 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dpmbv"] Dec 05 06:27:37 crc kubenswrapper[4865]: I1205 06:27:37.388192 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dpmbv" event={"ID":"9c96356d-d787-4411-84e9-811b82aacc14","Type":"ContainerStarted","Data":"f6db3db8c4df9398a363f41c744a297ce78e8a486a390c52660fb7ca2962625b"} Dec 05 06:27:38 crc kubenswrapper[4865]: I1205 06:27:38.396940 4865 generic.go:334] "Generic (PLEG): container finished" podID="9c96356d-d787-4411-84e9-811b82aacc14" containerID="c39d841ad1bb6ed952593ccc16eefc9b13247585e41aa658bbdb0b704e1ad1e5" exitCode=0 Dec 05 06:27:38 crc kubenswrapper[4865]: I1205 06:27:38.397023 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dpmbv" event={"ID":"9c96356d-d787-4411-84e9-811b82aacc14","Type":"ContainerDied","Data":"c39d841ad1bb6ed952593ccc16eefc9b13247585e41aa658bbdb0b704e1ad1e5"} Dec 05 06:27:38 crc kubenswrapper[4865]: I1205 06:27:38.400190 4865 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 06:27:39 crc kubenswrapper[4865]: I1205 06:27:39.407685 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dpmbv" event={"ID":"9c96356d-d787-4411-84e9-811b82aacc14","Type":"ContainerStarted","Data":"9eb3ee97c663d33d2d8acc7be8f5d15ef10bedb6c9afbb4f8d558b63bec81801"} Dec 05 06:27:40 crc kubenswrapper[4865]: I1205 06:27:40.432143 4865 generic.go:334] "Generic (PLEG): container finished" podID="9c96356d-d787-4411-84e9-811b82aacc14" containerID="9eb3ee97c663d33d2d8acc7be8f5d15ef10bedb6c9afbb4f8d558b63bec81801" exitCode=0 Dec 05 06:27:40 crc kubenswrapper[4865]: I1205 06:27:40.432201 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dpmbv" event={"ID":"9c96356d-d787-4411-84e9-811b82aacc14","Type":"ContainerDied","Data":"9eb3ee97c663d33d2d8acc7be8f5d15ef10bedb6c9afbb4f8d558b63bec81801"} Dec 05 06:27:41 crc kubenswrapper[4865]: I1205 06:27:41.447502 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dpmbv" event={"ID":"9c96356d-d787-4411-84e9-811b82aacc14","Type":"ContainerStarted","Data":"56fc2944d530de12af3e144a61498fa9d80f0d3dc2362a262f1afc32a7679a6d"} Dec 05 06:27:41 crc kubenswrapper[4865]: I1205 06:27:41.486943 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-dpmbv" podStartSLOduration=3.064870286 podStartE2EDuration="5.486916043s" podCreationTimestamp="2025-12-05 06:27:36 +0000 UTC" firstStartedPulling="2025-12-05 06:27:38.399690373 +0000 UTC m=+2077.679701595" lastFinishedPulling="2025-12-05 06:27:40.82173611 +0000 UTC m=+2080.101747352" observedRunningTime="2025-12-05 06:27:41.472392521 +0000 UTC m=+2080.752403773" watchObservedRunningTime="2025-12-05 06:27:41.486916043 +0000 UTC m=+2080.766927295" Dec 05 06:27:46 crc kubenswrapper[4865]: I1205 06:27:46.736492 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dpmbv" Dec 05 06:27:46 crc kubenswrapper[4865]: I1205 06:27:46.737479 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dpmbv" Dec 05 06:27:46 crc kubenswrapper[4865]: I1205 06:27:46.785671 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dpmbv" Dec 05 06:27:47 crc kubenswrapper[4865]: I1205 06:27:47.559666 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dpmbv" Dec 05 06:27:47 crc kubenswrapper[4865]: I1205 06:27:47.609636 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dpmbv"] Dec 05 06:27:49 crc kubenswrapper[4865]: I1205 06:27:49.521201 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dpmbv" podUID="9c96356d-d787-4411-84e9-811b82aacc14" containerName="registry-server" containerID="cri-o://56fc2944d530de12af3e144a61498fa9d80f0d3dc2362a262f1afc32a7679a6d" gracePeriod=2 Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.074399 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dpmbv" Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.213717 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lkfh\" (UniqueName: \"kubernetes.io/projected/9c96356d-d787-4411-84e9-811b82aacc14-kube-api-access-4lkfh\") pod \"9c96356d-d787-4411-84e9-811b82aacc14\" (UID: \"9c96356d-d787-4411-84e9-811b82aacc14\") " Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.213809 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c96356d-d787-4411-84e9-811b82aacc14-catalog-content\") pod \"9c96356d-d787-4411-84e9-811b82aacc14\" (UID: \"9c96356d-d787-4411-84e9-811b82aacc14\") " Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.213935 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c96356d-d787-4411-84e9-811b82aacc14-utilities\") pod \"9c96356d-d787-4411-84e9-811b82aacc14\" (UID: \"9c96356d-d787-4411-84e9-811b82aacc14\") " Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.214808 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c96356d-d787-4411-84e9-811b82aacc14-utilities" (OuterVolumeSpecName: "utilities") pod "9c96356d-d787-4411-84e9-811b82aacc14" (UID: "9c96356d-d787-4411-84e9-811b82aacc14"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.215934 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c96356d-d787-4411-84e9-811b82aacc14-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.225007 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c96356d-d787-4411-84e9-811b82aacc14-kube-api-access-4lkfh" (OuterVolumeSpecName: "kube-api-access-4lkfh") pod "9c96356d-d787-4411-84e9-811b82aacc14" (UID: "9c96356d-d787-4411-84e9-811b82aacc14"). InnerVolumeSpecName "kube-api-access-4lkfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.269513 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c96356d-d787-4411-84e9-811b82aacc14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c96356d-d787-4411-84e9-811b82aacc14" (UID: "9c96356d-d787-4411-84e9-811b82aacc14"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.318122 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c96356d-d787-4411-84e9-811b82aacc14-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.318157 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lkfh\" (UniqueName: \"kubernetes.io/projected/9c96356d-d787-4411-84e9-811b82aacc14-kube-api-access-4lkfh\") on node \"crc\" DevicePath \"\"" Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.532441 4865 generic.go:334] "Generic (PLEG): container finished" podID="9c96356d-d787-4411-84e9-811b82aacc14" containerID="56fc2944d530de12af3e144a61498fa9d80f0d3dc2362a262f1afc32a7679a6d" exitCode=0 Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.532479 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dpmbv" event={"ID":"9c96356d-d787-4411-84e9-811b82aacc14","Type":"ContainerDied","Data":"56fc2944d530de12af3e144a61498fa9d80f0d3dc2362a262f1afc32a7679a6d"} Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.532503 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dpmbv" event={"ID":"9c96356d-d787-4411-84e9-811b82aacc14","Type":"ContainerDied","Data":"f6db3db8c4df9398a363f41c744a297ce78e8a486a390c52660fb7ca2962625b"} Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.532518 4865 scope.go:117] "RemoveContainer" containerID="56fc2944d530de12af3e144a61498fa9d80f0d3dc2362a262f1afc32a7679a6d" Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.532654 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dpmbv" Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.556785 4865 scope.go:117] "RemoveContainer" containerID="9eb3ee97c663d33d2d8acc7be8f5d15ef10bedb6c9afbb4f8d558b63bec81801" Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.586891 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dpmbv"] Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.593102 4865 scope.go:117] "RemoveContainer" containerID="c39d841ad1bb6ed952593ccc16eefc9b13247585e41aa658bbdb0b704e1ad1e5" Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.594231 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dpmbv"] Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.636480 4865 scope.go:117] "RemoveContainer" containerID="56fc2944d530de12af3e144a61498fa9d80f0d3dc2362a262f1afc32a7679a6d" Dec 05 06:27:50 crc kubenswrapper[4865]: E1205 06:27:50.637328 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56fc2944d530de12af3e144a61498fa9d80f0d3dc2362a262f1afc32a7679a6d\": container with ID starting with 56fc2944d530de12af3e144a61498fa9d80f0d3dc2362a262f1afc32a7679a6d not found: ID does not exist" containerID="56fc2944d530de12af3e144a61498fa9d80f0d3dc2362a262f1afc32a7679a6d" Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.637371 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56fc2944d530de12af3e144a61498fa9d80f0d3dc2362a262f1afc32a7679a6d"} err="failed to get container status \"56fc2944d530de12af3e144a61498fa9d80f0d3dc2362a262f1afc32a7679a6d\": rpc error: code = NotFound desc = could not find container \"56fc2944d530de12af3e144a61498fa9d80f0d3dc2362a262f1afc32a7679a6d\": container with ID starting with 56fc2944d530de12af3e144a61498fa9d80f0d3dc2362a262f1afc32a7679a6d not found: ID does not exist" Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.637395 4865 scope.go:117] "RemoveContainer" containerID="9eb3ee97c663d33d2d8acc7be8f5d15ef10bedb6c9afbb4f8d558b63bec81801" Dec 05 06:27:50 crc kubenswrapper[4865]: E1205 06:27:50.637631 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eb3ee97c663d33d2d8acc7be8f5d15ef10bedb6c9afbb4f8d558b63bec81801\": container with ID starting with 9eb3ee97c663d33d2d8acc7be8f5d15ef10bedb6c9afbb4f8d558b63bec81801 not found: ID does not exist" containerID="9eb3ee97c663d33d2d8acc7be8f5d15ef10bedb6c9afbb4f8d558b63bec81801" Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.637651 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eb3ee97c663d33d2d8acc7be8f5d15ef10bedb6c9afbb4f8d558b63bec81801"} err="failed to get container status \"9eb3ee97c663d33d2d8acc7be8f5d15ef10bedb6c9afbb4f8d558b63bec81801\": rpc error: code = NotFound desc = could not find container \"9eb3ee97c663d33d2d8acc7be8f5d15ef10bedb6c9afbb4f8d558b63bec81801\": container with ID starting with 9eb3ee97c663d33d2d8acc7be8f5d15ef10bedb6c9afbb4f8d558b63bec81801 not found: ID does not exist" Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.637670 4865 scope.go:117] "RemoveContainer" containerID="c39d841ad1bb6ed952593ccc16eefc9b13247585e41aa658bbdb0b704e1ad1e5" Dec 05 06:27:50 crc kubenswrapper[4865]: E1205 06:27:50.637902 4865 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c39d841ad1bb6ed952593ccc16eefc9b13247585e41aa658bbdb0b704e1ad1e5\": container with ID starting with c39d841ad1bb6ed952593ccc16eefc9b13247585e41aa658bbdb0b704e1ad1e5 not found: ID does not exist" containerID="c39d841ad1bb6ed952593ccc16eefc9b13247585e41aa658bbdb0b704e1ad1e5" Dec 05 06:27:50 crc kubenswrapper[4865]: I1205 06:27:50.637927 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c39d841ad1bb6ed952593ccc16eefc9b13247585e41aa658bbdb0b704e1ad1e5"} err="failed to get container status \"c39d841ad1bb6ed952593ccc16eefc9b13247585e41aa658bbdb0b704e1ad1e5\": rpc error: code = NotFound desc = could not find container \"c39d841ad1bb6ed952593ccc16eefc9b13247585e41aa658bbdb0b704e1ad1e5\": container with ID starting with c39d841ad1bb6ed952593ccc16eefc9b13247585e41aa658bbdb0b704e1ad1e5 not found: ID does not exist" Dec 05 06:27:51 crc kubenswrapper[4865]: I1205 06:27:51.042862 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c96356d-d787-4411-84e9-811b82aacc14" path="/var/lib/kubelet/pods/9c96356d-d787-4411-84e9-811b82aacc14/volumes" Dec 05 06:27:51 crc kubenswrapper[4865]: I1205 06:27:51.545037 4865 generic.go:334] "Generic (PLEG): container finished" podID="f01b2a46-843f-4022-ac72-af49312bbcc8" containerID="217ef29b39166c76d818de3d2b3fd83a3d01d7c16fe8919b8079318a632dc8ef" exitCode=0 Dec 05 06:27:51 crc kubenswrapper[4865]: I1205 06:27:51.545121 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt" event={"ID":"f01b2a46-843f-4022-ac72-af49312bbcc8","Type":"ContainerDied","Data":"217ef29b39166c76d818de3d2b3fd83a3d01d7c16fe8919b8079318a632dc8ef"} Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.075416 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.195745 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88lnt\" (UniqueName: \"kubernetes.io/projected/f01b2a46-843f-4022-ac72-af49312bbcc8-kube-api-access-88lnt\") pod \"f01b2a46-843f-4022-ac72-af49312bbcc8\" (UID: \"f01b2a46-843f-4022-ac72-af49312bbcc8\") " Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.195843 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f01b2a46-843f-4022-ac72-af49312bbcc8-ssh-key\") pod \"f01b2a46-843f-4022-ac72-af49312bbcc8\" (UID: \"f01b2a46-843f-4022-ac72-af49312bbcc8\") " Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.195865 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f01b2a46-843f-4022-ac72-af49312bbcc8-inventory\") pod \"f01b2a46-843f-4022-ac72-af49312bbcc8\" (UID: \"f01b2a46-843f-4022-ac72-af49312bbcc8\") " Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.203247 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f01b2a46-843f-4022-ac72-af49312bbcc8-kube-api-access-88lnt" (OuterVolumeSpecName: "kube-api-access-88lnt") pod "f01b2a46-843f-4022-ac72-af49312bbcc8" (UID: "f01b2a46-843f-4022-ac72-af49312bbcc8"). InnerVolumeSpecName "kube-api-access-88lnt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.225490 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f01b2a46-843f-4022-ac72-af49312bbcc8-inventory" (OuterVolumeSpecName: "inventory") pod "f01b2a46-843f-4022-ac72-af49312bbcc8" (UID: "f01b2a46-843f-4022-ac72-af49312bbcc8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.230995 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f01b2a46-843f-4022-ac72-af49312bbcc8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f01b2a46-843f-4022-ac72-af49312bbcc8" (UID: "f01b2a46-843f-4022-ac72-af49312bbcc8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.298563 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88lnt\" (UniqueName: \"kubernetes.io/projected/f01b2a46-843f-4022-ac72-af49312bbcc8-kube-api-access-88lnt\") on node \"crc\" DevicePath \"\"" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.298597 4865 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f01b2a46-843f-4022-ac72-af49312bbcc8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.298609 4865 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f01b2a46-843f-4022-ac72-af49312bbcc8-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.572725 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt" event={"ID":"f01b2a46-843f-4022-ac72-af49312bbcc8","Type":"ContainerDied","Data":"f0a05d4b1ef4d11bba30bce985ba4d5303cb662bd0de8b5b866d21cf59bf47de"} Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.572770 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0a05d4b1ef4d11bba30bce985ba4d5303cb662bd0de8b5b866d21cf59bf47de" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.572816 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.677622 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j7bbd"] Dec 05 06:27:53 crc kubenswrapper[4865]: E1205 06:27:53.678391 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01b2a46-843f-4022-ac72-af49312bbcc8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.678495 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01b2a46-843f-4022-ac72-af49312bbcc8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 06:27:53 crc kubenswrapper[4865]: E1205 06:27:53.678575 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c96356d-d787-4411-84e9-811b82aacc14" containerName="registry-server" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.678661 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c96356d-d787-4411-84e9-811b82aacc14" containerName="registry-server" Dec 05 06:27:53 crc kubenswrapper[4865]: E1205 06:27:53.678756 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c96356d-d787-4411-84e9-811b82aacc14" containerName="extract-utilities" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.678867 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c96356d-d787-4411-84e9-811b82aacc14" containerName="extract-utilities" Dec 05 06:27:53 crc kubenswrapper[4865]: E1205 06:27:53.678979 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c96356d-d787-4411-84e9-811b82aacc14" containerName="extract-content" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.679055 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c96356d-d787-4411-84e9-811b82aacc14" containerName="extract-content" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.679372 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="f01b2a46-843f-4022-ac72-af49312bbcc8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.679486 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c96356d-d787-4411-84e9-811b82aacc14" containerName="registry-server" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.680350 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j7bbd" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.682868 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gtc4b" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.683247 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.685405 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.685550 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.691846 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j7bbd"] Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.721517 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j7bbd\" (UID: \"0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b\") " pod="openstack/ssh-known-hosts-edpm-deployment-j7bbd" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.721779 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s42h8\" (UniqueName: \"kubernetes.io/projected/0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b-kube-api-access-s42h8\") pod \"ssh-known-hosts-edpm-deployment-j7bbd\" (UID: \"0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b\") " pod="openstack/ssh-known-hosts-edpm-deployment-j7bbd" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.721916 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j7bbd\" (UID: \"0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b\") " pod="openstack/ssh-known-hosts-edpm-deployment-j7bbd" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.823158 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j7bbd\" (UID: \"0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b\") " pod="openstack/ssh-known-hosts-edpm-deployment-j7bbd" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.823439 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s42h8\" (UniqueName: \"kubernetes.io/projected/0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b-kube-api-access-s42h8\") pod \"ssh-known-hosts-edpm-deployment-j7bbd\" (UID: \"0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b\") " pod="openstack/ssh-known-hosts-edpm-deployment-j7bbd" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.823517 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j7bbd\" (UID: \"0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b\") " pod="openstack/ssh-known-hosts-edpm-deployment-j7bbd" Dec 05 06:27:53 crc 
kubenswrapper[4865]: I1205 06:27:53.832370 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j7bbd\" (UID: \"0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b\") " pod="openstack/ssh-known-hosts-edpm-deployment-j7bbd" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.835622 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j7bbd\" (UID: \"0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b\") " pod="openstack/ssh-known-hosts-edpm-deployment-j7bbd" Dec 05 06:27:53 crc kubenswrapper[4865]: I1205 06:27:53.847249 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s42h8\" (UniqueName: \"kubernetes.io/projected/0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b-kube-api-access-s42h8\") pod \"ssh-known-hosts-edpm-deployment-j7bbd\" (UID: \"0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b\") " pod="openstack/ssh-known-hosts-edpm-deployment-j7bbd" Dec 05 06:27:54 crc kubenswrapper[4865]: I1205 06:27:54.047741 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j7bbd" Dec 05 06:27:54 crc kubenswrapper[4865]: I1205 06:27:54.641521 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j7bbd"] Dec 05 06:27:54 crc kubenswrapper[4865]: W1205 06:27:54.649300 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c8aa8a3_8378_4a97_af3f_2b59ad1d2a0b.slice/crio-746a9346b90ba4d35ba27449dd256d83c9f51ab3acd7fd71ea4c1912e2ed92af WatchSource:0}: Error finding container 746a9346b90ba4d35ba27449dd256d83c9f51ab3acd7fd71ea4c1912e2ed92af: Status 404 returned error can't find the container with id 746a9346b90ba4d35ba27449dd256d83c9f51ab3acd7fd71ea4c1912e2ed92af Dec 05 06:27:55 crc kubenswrapper[4865]: I1205 06:27:55.594062 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j7bbd" event={"ID":"0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b","Type":"ContainerStarted","Data":"54a08fcaab3d5a265e442d7e35cbe7ffaed7424d55225ac79e450d7e4f379c57"} Dec 05 06:27:55 crc kubenswrapper[4865]: I1205 06:27:55.594626 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j7bbd" event={"ID":"0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b","Type":"ContainerStarted","Data":"746a9346b90ba4d35ba27449dd256d83c9f51ab3acd7fd71ea4c1912e2ed92af"} Dec 05 06:27:55 crc kubenswrapper[4865]: I1205 06:27:55.628580 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-j7bbd" podStartSLOduration=2.12035544 podStartE2EDuration="2.628554962s" podCreationTimestamp="2025-12-05 06:27:53 +0000 UTC" firstStartedPulling="2025-12-05 06:27:54.651175538 +0000 UTC m=+2093.931186760" lastFinishedPulling="2025-12-05 06:27:55.15937506 +0000 UTC m=+2094.439386282" observedRunningTime="2025-12-05 06:27:55.615128281 +0000 UTC m=+2094.895139533" watchObservedRunningTime="2025-12-05 06:27:55.628554962 +0000 UTC m=+2094.908566224" Dec 05 06:28:03 crc kubenswrapper[4865]: I1205 06:28:03.667450 4865 generic.go:334] "Generic (PLEG): container finished" 
podID="0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b" containerID="54a08fcaab3d5a265e442d7e35cbe7ffaed7424d55225ac79e450d7e4f379c57" exitCode=0 Dec 05 06:28:03 crc kubenswrapper[4865]: I1205 06:28:03.667550 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j7bbd" event={"ID":"0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b","Type":"ContainerDied","Data":"54a08fcaab3d5a265e442d7e35cbe7ffaed7424d55225ac79e450d7e4f379c57"} Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.087064 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j7bbd" Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.152963 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b-inventory-0\") pod \"0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b\" (UID: \"0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b\") " Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.153447 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s42h8\" (UniqueName: \"kubernetes.io/projected/0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b-kube-api-access-s42h8\") pod \"0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b\" (UID: \"0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b\") " Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.153525 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b-ssh-key-openstack-edpm-ipam\") pod \"0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b\" (UID: \"0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b\") " Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.158848 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b-kube-api-access-s42h8" (OuterVolumeSpecName: "kube-api-access-s42h8") pod "0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b" (UID: "0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b"). InnerVolumeSpecName "kube-api-access-s42h8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.182947 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b" (UID: "0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.185709 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b" (UID: "0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.255791 4865 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.255841 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s42h8\" (UniqueName: \"kubernetes.io/projected/0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b-kube-api-access-s42h8\") on node \"crc\" DevicePath \"\"" Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.255858 4865 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.686653 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j7bbd" event={"ID":"0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b","Type":"ContainerDied","Data":"746a9346b90ba4d35ba27449dd256d83c9f51ab3acd7fd71ea4c1912e2ed92af"} Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.686703 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="746a9346b90ba4d35ba27449dd256d83c9f51ab3acd7fd71ea4c1912e2ed92af" Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.686742 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j7bbd" Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.890941 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw"] Dec 05 06:28:05 crc kubenswrapper[4865]: E1205 06:28:05.894621 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b" containerName="ssh-known-hosts-edpm-deployment" Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.894645 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b" containerName="ssh-known-hosts-edpm-deployment" Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.895191 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b" containerName="ssh-known-hosts-edpm-deployment" Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.896128 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw" Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.898996 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.899527 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gtc4b" Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.899709 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.899890 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 06:28:05 crc kubenswrapper[4865]: I1205 06:28:05.911495 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw"] Dec 05 06:28:06 crc kubenswrapper[4865]: I1205 06:28:06.072012 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c38b5b25-e372-4601-9b9d-6b9d883a6953-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjvnw\" (UID: \"c38b5b25-e372-4601-9b9d-6b9d883a6953\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw" Dec 05 06:28:06 crc kubenswrapper[4865]: I1205 06:28:06.072074 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsfm4\" (UniqueName: \"kubernetes.io/projected/c38b5b25-e372-4601-9b9d-6b9d883a6953-kube-api-access-rsfm4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjvnw\" (UID: \"c38b5b25-e372-4601-9b9d-6b9d883a6953\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw" Dec 05 06:28:06 crc kubenswrapper[4865]: I1205 06:28:06.072133 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c38b5b25-e372-4601-9b9d-6b9d883a6953-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjvnw\" (UID: \"c38b5b25-e372-4601-9b9d-6b9d883a6953\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw" Dec 05 06:28:06 crc kubenswrapper[4865]: I1205 06:28:06.174003 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c38b5b25-e372-4601-9b9d-6b9d883a6953-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjvnw\" (UID: \"c38b5b25-e372-4601-9b9d-6b9d883a6953\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw" Dec 05 06:28:06 crc kubenswrapper[4865]: I1205 06:28:06.174067 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsfm4\" (UniqueName: \"kubernetes.io/projected/c38b5b25-e372-4601-9b9d-6b9d883a6953-kube-api-access-rsfm4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjvnw\" (UID: \"c38b5b25-e372-4601-9b9d-6b9d883a6953\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw" Dec 05 06:28:06 crc kubenswrapper[4865]: I1205 06:28:06.174114 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c38b5b25-e372-4601-9b9d-6b9d883a6953-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjvnw\" (UID: \"c38b5b25-e372-4601-9b9d-6b9d883a6953\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw" Dec 05 06:28:06 crc kubenswrapper[4865]: I1205 06:28:06.179219 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c38b5b25-e372-4601-9b9d-6b9d883a6953-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjvnw\" (UID: \"c38b5b25-e372-4601-9b9d-6b9d883a6953\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw" Dec 05 06:28:06 crc kubenswrapper[4865]: I1205 06:28:06.179387 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c38b5b25-e372-4601-9b9d-6b9d883a6953-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjvnw\" (UID: \"c38b5b25-e372-4601-9b9d-6b9d883a6953\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw" Dec 05 06:28:06 crc kubenswrapper[4865]: I1205 06:28:06.200199 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsfm4\" (UniqueName: \"kubernetes.io/projected/c38b5b25-e372-4601-9b9d-6b9d883a6953-kube-api-access-rsfm4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjvnw\" (UID: \"c38b5b25-e372-4601-9b9d-6b9d883a6953\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw" Dec 05 06:28:06 crc kubenswrapper[4865]: I1205 06:28:06.228342 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw" Dec 05 06:28:06 crc kubenswrapper[4865]: I1205 06:28:06.810378 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw"] Dec 05 06:28:07 crc kubenswrapper[4865]: I1205 06:28:07.709694 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw" event={"ID":"c38b5b25-e372-4601-9b9d-6b9d883a6953","Type":"ContainerStarted","Data":"ffb7782ff61268a19ec06c9d8db7a0dc38301e29289f00d9d0015830cee78d72"} Dec 05 06:28:08 crc kubenswrapper[4865]: I1205 06:28:08.727781 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw" event={"ID":"c38b5b25-e372-4601-9b9d-6b9d883a6953","Type":"ContainerStarted","Data":"c174e29e95ce17c77660ab018c73710a660c4e46127c3e84359ab9a351e3a669"} Dec 05 06:28:08 crc kubenswrapper[4865]: I1205 06:28:08.766335 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw" podStartSLOduration=3.067103162 podStartE2EDuration="3.766306952s" podCreationTimestamp="2025-12-05 06:28:05 +0000 UTC" firstStartedPulling="2025-12-05 06:28:06.819803908 +0000 UTC m=+2106.099815130" lastFinishedPulling="2025-12-05 06:28:07.519007698 +0000 UTC m=+2106.799018920" observedRunningTime="2025-12-05 06:28:08.751881922 +0000 UTC m=+2108.031893144" watchObservedRunningTime="2025-12-05 06:28:08.766306952 +0000 UTC m=+2108.046318214" Dec 05 06:28:17 crc kubenswrapper[4865]: I1205 06:28:17.815521 4865 generic.go:334] "Generic (PLEG): container finished" podID="c38b5b25-e372-4601-9b9d-6b9d883a6953" containerID="c174e29e95ce17c77660ab018c73710a660c4e46127c3e84359ab9a351e3a669" exitCode=0 Dec 05 06:28:17 crc kubenswrapper[4865]: I1205 06:28:17.816197 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw" 
event={"ID":"c38b5b25-e372-4601-9b9d-6b9d883a6953","Type":"ContainerDied","Data":"c174e29e95ce17c77660ab018c73710a660c4e46127c3e84359ab9a351e3a669"} Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.350680 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw" Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.434320 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsfm4\" (UniqueName: \"kubernetes.io/projected/c38b5b25-e372-4601-9b9d-6b9d883a6953-kube-api-access-rsfm4\") pod \"c38b5b25-e372-4601-9b9d-6b9d883a6953\" (UID: \"c38b5b25-e372-4601-9b9d-6b9d883a6953\") " Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.434520 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c38b5b25-e372-4601-9b9d-6b9d883a6953-inventory\") pod \"c38b5b25-e372-4601-9b9d-6b9d883a6953\" (UID: \"c38b5b25-e372-4601-9b9d-6b9d883a6953\") " Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.434663 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c38b5b25-e372-4601-9b9d-6b9d883a6953-ssh-key\") pod \"c38b5b25-e372-4601-9b9d-6b9d883a6953\" (UID: \"c38b5b25-e372-4601-9b9d-6b9d883a6953\") " Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.439934 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c38b5b25-e372-4601-9b9d-6b9d883a6953-kube-api-access-rsfm4" (OuterVolumeSpecName: "kube-api-access-rsfm4") pod "c38b5b25-e372-4601-9b9d-6b9d883a6953" (UID: "c38b5b25-e372-4601-9b9d-6b9d883a6953"). InnerVolumeSpecName "kube-api-access-rsfm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.468168 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c38b5b25-e372-4601-9b9d-6b9d883a6953-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c38b5b25-e372-4601-9b9d-6b9d883a6953" (UID: "c38b5b25-e372-4601-9b9d-6b9d883a6953"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.474693 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c38b5b25-e372-4601-9b9d-6b9d883a6953-inventory" (OuterVolumeSpecName: "inventory") pod "c38b5b25-e372-4601-9b9d-6b9d883a6953" (UID: "c38b5b25-e372-4601-9b9d-6b9d883a6953"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.537209 4865 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c38b5b25-e372-4601-9b9d-6b9d883a6953-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.537242 4865 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c38b5b25-e372-4601-9b9d-6b9d883a6953-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.537253 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsfm4\" (UniqueName: \"kubernetes.io/projected/c38b5b25-e372-4601-9b9d-6b9d883a6953-kube-api-access-rsfm4\") on node \"crc\" DevicePath \"\"" Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.838639 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw" event={"ID":"c38b5b25-e372-4601-9b9d-6b9d883a6953","Type":"ContainerDied","Data":"ffb7782ff61268a19ec06c9d8db7a0dc38301e29289f00d9d0015830cee78d72"} Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.838703 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjvnw" Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.838705 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffb7782ff61268a19ec06c9d8db7a0dc38301e29289f00d9d0015830cee78d72" Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.954164 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx"] Dec 05 06:28:19 crc kubenswrapper[4865]: E1205 06:28:19.954689 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c38b5b25-e372-4601-9b9d-6b9d883a6953" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.954714 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="c38b5b25-e372-4601-9b9d-6b9d883a6953" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.954999 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="c38b5b25-e372-4601-9b9d-6b9d883a6953" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.955864 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx" Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.960182 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gtc4b" Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.960435 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.960639 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.960664 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 06:28:19 crc kubenswrapper[4865]: I1205 06:28:19.969525 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx"] Dec 05 06:28:20 crc kubenswrapper[4865]: I1205 06:28:20.047164 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmhqm\" (UniqueName: \"kubernetes.io/projected/4bbde20d-cc33-4f77-857e-41bb96a20fe9-kube-api-access-bmhqm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx\" (UID: \"4bbde20d-cc33-4f77-857e-41bb96a20fe9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx" Dec 05 06:28:20 crc kubenswrapper[4865]: I1205 06:28:20.047575 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4bbde20d-cc33-4f77-857e-41bb96a20fe9-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx\" (UID: \"4bbde20d-cc33-4f77-857e-41bb96a20fe9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx" Dec 05 06:28:20 crc kubenswrapper[4865]: I1205 06:28:20.047645 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bbde20d-cc33-4f77-857e-41bb96a20fe9-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx\" (UID: \"4bbde20d-cc33-4f77-857e-41bb96a20fe9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx" Dec 05 06:28:20 crc kubenswrapper[4865]: I1205 06:28:20.148925 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4bbde20d-cc33-4f77-857e-41bb96a20fe9-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx\" (UID: \"4bbde20d-cc33-4f77-857e-41bb96a20fe9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx" Dec 05 06:28:20 crc kubenswrapper[4865]: I1205 06:28:20.148997 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bbde20d-cc33-4f77-857e-41bb96a20fe9-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx\" (UID: \"4bbde20d-cc33-4f77-857e-41bb96a20fe9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx" Dec 05 06:28:20 crc kubenswrapper[4865]: I1205 06:28:20.149078 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmhqm\" (UniqueName: \"kubernetes.io/projected/4bbde20d-cc33-4f77-857e-41bb96a20fe9-kube-api-access-bmhqm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx\" (UID: 
\"4bbde20d-cc33-4f77-857e-41bb96a20fe9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx" Dec 05 06:28:20 crc kubenswrapper[4865]: I1205 06:28:20.154166 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4bbde20d-cc33-4f77-857e-41bb96a20fe9-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx\" (UID: \"4bbde20d-cc33-4f77-857e-41bb96a20fe9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx" Dec 05 06:28:20 crc kubenswrapper[4865]: I1205 06:28:20.158399 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bbde20d-cc33-4f77-857e-41bb96a20fe9-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx\" (UID: \"4bbde20d-cc33-4f77-857e-41bb96a20fe9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx" Dec 05 06:28:20 crc kubenswrapper[4865]: I1205 06:28:20.166309 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmhqm\" (UniqueName: \"kubernetes.io/projected/4bbde20d-cc33-4f77-857e-41bb96a20fe9-kube-api-access-bmhqm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx\" (UID: \"4bbde20d-cc33-4f77-857e-41bb96a20fe9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx" Dec 05 06:28:20 crc kubenswrapper[4865]: I1205 06:28:20.289700 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx" Dec 05 06:28:20 crc kubenswrapper[4865]: I1205 06:28:20.840691 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx"] Dec 05 06:28:21 crc kubenswrapper[4865]: I1205 06:28:21.862851 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx" event={"ID":"4bbde20d-cc33-4f77-857e-41bb96a20fe9","Type":"ContainerStarted","Data":"9e5c116dee6c0230f4214c702f4ad9280623fde3a62a12a4aa79003ee455ebee"} Dec 05 06:28:21 crc kubenswrapper[4865]: I1205 06:28:21.862905 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx" event={"ID":"4bbde20d-cc33-4f77-857e-41bb96a20fe9","Type":"ContainerStarted","Data":"213364142be931a670e20294233dbe07aa8d12e245bb5c7e784f0f1ed0b028e7"} Dec 05 06:28:21 crc kubenswrapper[4865]: I1205 06:28:21.885054 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx" podStartSLOduration=2.462957066 podStartE2EDuration="2.884920268s" podCreationTimestamp="2025-12-05 06:28:19 +0000 UTC" firstStartedPulling="2025-12-05 06:28:20.876919893 +0000 UTC m=+2120.156931115" lastFinishedPulling="2025-12-05 06:28:21.298883075 +0000 UTC m=+2120.578894317" observedRunningTime="2025-12-05 06:28:21.884066654 +0000 UTC m=+2121.164077876" watchObservedRunningTime="2025-12-05 06:28:21.884920268 +0000 UTC m=+2121.164931500" Dec 05 06:28:25 crc kubenswrapper[4865]: I1205 06:28:25.657534 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rjbkg"] Dec 05 06:28:25 crc kubenswrapper[4865]: I1205 06:28:25.659815 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjbkg" Dec 05 06:28:25 crc kubenswrapper[4865]: I1205 06:28:25.712039 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjbkg"] Dec 05 06:28:25 crc kubenswrapper[4865]: I1205 06:28:25.770211 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87g26\" (UniqueName: \"kubernetes.io/projected/29f579d9-6001-4b5d-b4ab-be0e764fa46d-kube-api-access-87g26\") pod \"redhat-marketplace-rjbkg\" (UID: \"29f579d9-6001-4b5d-b4ab-be0e764fa46d\") " pod="openshift-marketplace/redhat-marketplace-rjbkg" Dec 05 06:28:25 crc kubenswrapper[4865]: I1205 06:28:25.770271 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29f579d9-6001-4b5d-b4ab-be0e764fa46d-utilities\") pod \"redhat-marketplace-rjbkg\" (UID: \"29f579d9-6001-4b5d-b4ab-be0e764fa46d\") " pod="openshift-marketplace/redhat-marketplace-rjbkg" Dec 05 06:28:25 crc kubenswrapper[4865]: I1205 06:28:25.770305 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29f579d9-6001-4b5d-b4ab-be0e764fa46d-catalog-content\") pod \"redhat-marketplace-rjbkg\" (UID: \"29f579d9-6001-4b5d-b4ab-be0e764fa46d\") " pod="openshift-marketplace/redhat-marketplace-rjbkg" Dec 05 06:28:25 crc kubenswrapper[4865]: I1205 06:28:25.872580 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87g26\" (UniqueName: \"kubernetes.io/projected/29f579d9-6001-4b5d-b4ab-be0e764fa46d-kube-api-access-87g26\") pod \"redhat-marketplace-rjbkg\" (UID: \"29f579d9-6001-4b5d-b4ab-be0e764fa46d\") " pod="openshift-marketplace/redhat-marketplace-rjbkg" Dec 05 06:28:25 crc kubenswrapper[4865]: I1205 06:28:25.872653 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29f579d9-6001-4b5d-b4ab-be0e764fa46d-utilities\") pod \"redhat-marketplace-rjbkg\" (UID: \"29f579d9-6001-4b5d-b4ab-be0e764fa46d\") " pod="openshift-marketplace/redhat-marketplace-rjbkg" Dec 05 06:28:25 crc kubenswrapper[4865]: I1205 06:28:25.872677 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29f579d9-6001-4b5d-b4ab-be0e764fa46d-catalog-content\") pod \"redhat-marketplace-rjbkg\" (UID: \"29f579d9-6001-4b5d-b4ab-be0e764fa46d\") " pod="openshift-marketplace/redhat-marketplace-rjbkg" Dec 05 06:28:25 crc kubenswrapper[4865]: I1205 06:28:25.873242 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29f579d9-6001-4b5d-b4ab-be0e764fa46d-utilities\") pod \"redhat-marketplace-rjbkg\" (UID: \"29f579d9-6001-4b5d-b4ab-be0e764fa46d\") " pod="openshift-marketplace/redhat-marketplace-rjbkg" Dec 05 06:28:25 crc kubenswrapper[4865]: I1205 06:28:25.873256 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29f579d9-6001-4b5d-b4ab-be0e764fa46d-catalog-content\") pod \"redhat-marketplace-rjbkg\" (UID: \"29f579d9-6001-4b5d-b4ab-be0e764fa46d\") " pod="openshift-marketplace/redhat-marketplace-rjbkg" Dec 05 06:28:25 crc kubenswrapper[4865]: I1205 06:28:25.896578 4865 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-87g26\" (UniqueName: \"kubernetes.io/projected/29f579d9-6001-4b5d-b4ab-be0e764fa46d-kube-api-access-87g26\") pod \"redhat-marketplace-rjbkg\" (UID: \"29f579d9-6001-4b5d-b4ab-be0e764fa46d\") " pod="openshift-marketplace/redhat-marketplace-rjbkg" Dec 05 06:28:26 crc kubenswrapper[4865]: I1205 06:28:26.014531 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjbkg" Dec 05 06:28:26 crc kubenswrapper[4865]: I1205 06:28:26.623110 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjbkg"] Dec 05 06:28:26 crc kubenswrapper[4865]: I1205 06:28:26.926754 4865 generic.go:334] "Generic (PLEG): container finished" podID="29f579d9-6001-4b5d-b4ab-be0e764fa46d" containerID="25f3a0ecfff0edbb94c24b00dc8c990d65b05d270bd27c3956d8ec2ffae7ed50" exitCode=0 Dec 05 06:28:26 crc kubenswrapper[4865]: I1205 06:28:26.926818 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjbkg" event={"ID":"29f579d9-6001-4b5d-b4ab-be0e764fa46d","Type":"ContainerDied","Data":"25f3a0ecfff0edbb94c24b00dc8c990d65b05d270bd27c3956d8ec2ffae7ed50"} Dec 05 06:28:26 crc kubenswrapper[4865]: I1205 06:28:26.926867 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjbkg" event={"ID":"29f579d9-6001-4b5d-b4ab-be0e764fa46d","Type":"ContainerStarted","Data":"bf39ebc00862315784f68e394e6aaf1993fc95a3c947ea054665eb598c66ecda"} Dec 05 06:28:27 crc kubenswrapper[4865]: I1205 06:28:27.939315 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjbkg" event={"ID":"29f579d9-6001-4b5d-b4ab-be0e764fa46d","Type":"ContainerStarted","Data":"a5b8526e579cd57b4e1da7e765b0be2eda354a7147ecbe56f14ed04f8839e498"} Dec 05 06:28:28 crc kubenswrapper[4865]: I1205 06:28:28.957962 4865 generic.go:334] "Generic (PLEG): container finished" podID="29f579d9-6001-4b5d-b4ab-be0e764fa46d" containerID="a5b8526e579cd57b4e1da7e765b0be2eda354a7147ecbe56f14ed04f8839e498" exitCode=0 Dec 05 06:28:28 crc kubenswrapper[4865]: I1205 06:28:28.958038 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjbkg" event={"ID":"29f579d9-6001-4b5d-b4ab-be0e764fa46d","Type":"ContainerDied","Data":"a5b8526e579cd57b4e1da7e765b0be2eda354a7147ecbe56f14ed04f8839e498"} Dec 05 06:28:29 crc kubenswrapper[4865]: I1205 06:28:29.973024 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjbkg" event={"ID":"29f579d9-6001-4b5d-b4ab-be0e764fa46d","Type":"ContainerStarted","Data":"d7ebcf268ddb9fb4758ca143a1fc10fd92af0ca8963c3118ca13a82b5791c919"} Dec 05 06:28:30 crc kubenswrapper[4865]: I1205 06:28:30.002493 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rjbkg" podStartSLOduration=2.563023505 podStartE2EDuration="5.002473329s" podCreationTimestamp="2025-12-05 06:28:25 +0000 UTC" firstStartedPulling="2025-12-05 06:28:26.929707998 +0000 UTC m=+2126.209719220" lastFinishedPulling="2025-12-05 06:28:29.369157792 +0000 UTC m=+2128.649169044" observedRunningTime="2025-12-05 06:28:29.997154078 +0000 UTC m=+2129.277165300" watchObservedRunningTime="2025-12-05 06:28:30.002473329 +0000 UTC m=+2129.282484551" Dec 05 06:28:32 crc kubenswrapper[4865]: E1205 06:28:32.339052 4865 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" 
err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bbde20d_cc33_4f77_857e_41bb96a20fe9.slice/crio-conmon-9e5c116dee6c0230f4214c702f4ad9280623fde3a62a12a4aa79003ee455ebee.scope\": RecentStats: unable to find data in memory cache]" Dec 05 06:28:33 crc kubenswrapper[4865]: I1205 06:28:33.001182 4865 generic.go:334] "Generic (PLEG): container finished" podID="4bbde20d-cc33-4f77-857e-41bb96a20fe9" containerID="9e5c116dee6c0230f4214c702f4ad9280623fde3a62a12a4aa79003ee455ebee" exitCode=0 Dec 05 06:28:33 crc kubenswrapper[4865]: I1205 06:28:33.001240 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx" event={"ID":"4bbde20d-cc33-4f77-857e-41bb96a20fe9","Type":"ContainerDied","Data":"9e5c116dee6c0230f4214c702f4ad9280623fde3a62a12a4aa79003ee455ebee"} Dec 05 06:28:34 crc kubenswrapper[4865]: I1205 06:28:34.513136 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx" Dec 05 06:28:34 crc kubenswrapper[4865]: I1205 06:28:34.680967 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4bbde20d-cc33-4f77-857e-41bb96a20fe9-inventory\") pod \"4bbde20d-cc33-4f77-857e-41bb96a20fe9\" (UID: \"4bbde20d-cc33-4f77-857e-41bb96a20fe9\") " Dec 05 06:28:34 crc kubenswrapper[4865]: I1205 06:28:34.681228 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmhqm\" (UniqueName: \"kubernetes.io/projected/4bbde20d-cc33-4f77-857e-41bb96a20fe9-kube-api-access-bmhqm\") pod \"4bbde20d-cc33-4f77-857e-41bb96a20fe9\" (UID: \"4bbde20d-cc33-4f77-857e-41bb96a20fe9\") " Dec 05 06:28:34 crc kubenswrapper[4865]: I1205 06:28:34.681258 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bbde20d-cc33-4f77-857e-41bb96a20fe9-ssh-key\") pod \"4bbde20d-cc33-4f77-857e-41bb96a20fe9\" (UID: \"4bbde20d-cc33-4f77-857e-41bb96a20fe9\") " Dec 05 06:28:34 crc kubenswrapper[4865]: I1205 06:28:34.686037 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bbde20d-cc33-4f77-857e-41bb96a20fe9-kube-api-access-bmhqm" (OuterVolumeSpecName: "kube-api-access-bmhqm") pod "4bbde20d-cc33-4f77-857e-41bb96a20fe9" (UID: "4bbde20d-cc33-4f77-857e-41bb96a20fe9"). InnerVolumeSpecName "kube-api-access-bmhqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:28:34 crc kubenswrapper[4865]: I1205 06:28:34.712897 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bbde20d-cc33-4f77-857e-41bb96a20fe9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4bbde20d-cc33-4f77-857e-41bb96a20fe9" (UID: "4bbde20d-cc33-4f77-857e-41bb96a20fe9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:28:34 crc kubenswrapper[4865]: I1205 06:28:34.719573 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bbde20d-cc33-4f77-857e-41bb96a20fe9-inventory" (OuterVolumeSpecName: "inventory") pod "4bbde20d-cc33-4f77-857e-41bb96a20fe9" (UID: "4bbde20d-cc33-4f77-857e-41bb96a20fe9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:28:34 crc kubenswrapper[4865]: I1205 06:28:34.783873 4865 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4bbde20d-cc33-4f77-857e-41bb96a20fe9-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 06:28:34 crc kubenswrapper[4865]: I1205 06:28:34.783916 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmhqm\" (UniqueName: \"kubernetes.io/projected/4bbde20d-cc33-4f77-857e-41bb96a20fe9-kube-api-access-bmhqm\") on node \"crc\" DevicePath \"\"" Dec 05 06:28:34 crc kubenswrapper[4865]: I1205 06:28:34.783927 4865 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bbde20d-cc33-4f77-857e-41bb96a20fe9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.024056 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx" event={"ID":"4bbde20d-cc33-4f77-857e-41bb96a20fe9","Type":"ContainerDied","Data":"213364142be931a670e20294233dbe07aa8d12e245bb5c7e784f0f1ed0b028e7"} Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.024348 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="213364142be931a670e20294233dbe07aa8d12e245bb5c7e784f0f1ed0b028e7" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.024108 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.167977 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk"] Dec 05 06:28:35 crc kubenswrapper[4865]: E1205 06:28:35.168451 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bbde20d-cc33-4f77-857e-41bb96a20fe9" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.168467 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bbde20d-cc33-4f77-857e-41bb96a20fe9" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.168693 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bbde20d-cc33-4f77-857e-41bb96a20fe9" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.169403 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.172463 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.173249 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.173303 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.174290 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.174484 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.175051 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.175364 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gtc4b" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.180096 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.186971 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk"] Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.293548 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.293586 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.293696 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.293733 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.293775 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.293799 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.293830 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.293886 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.293906 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.293940 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.293957 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.293976 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j46bj\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-kube-api-access-j46bj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.293993 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.294012 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.395431 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.395525 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.395549 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.395586 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.395602 4865 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.395622 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j46bj\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-kube-api-access-j46bj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.395640 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.395662 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.395682 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.395698 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.395758 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.395782 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.396621 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.396669 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.400813 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.402215 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.403478 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.403556 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.404566 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: 
I1205 06:28:35.405419 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.405897 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.406617 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.409428 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.411047 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.412127 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.416309 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.417992 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j46bj\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-kube-api-access-j46bj\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.422528 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-br9nk\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:35 crc kubenswrapper[4865]: I1205 06:28:35.493444 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:28:36 crc kubenswrapper[4865]: I1205 06:28:36.016341 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rjbkg" Dec 05 06:28:36 crc kubenswrapper[4865]: I1205 06:28:36.016966 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rjbkg" Dec 05 06:28:36 crc kubenswrapper[4865]: I1205 06:28:36.045958 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk"] Dec 05 06:28:36 crc kubenswrapper[4865]: I1205 06:28:36.103306 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rjbkg" Dec 05 06:28:36 crc kubenswrapper[4865]: I1205 06:28:36.150451 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rjbkg" Dec 05 06:28:37 crc kubenswrapper[4865]: I1205 06:28:37.053594 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" event={"ID":"3112d62b-5125-4614-a5c3-6a50bf1cc515","Type":"ContainerStarted","Data":"99747395e299a32c26161df96fc342041e17cefd6fdf164a1082e51364babab0"} Dec 05 06:28:37 crc kubenswrapper[4865]: I1205 06:28:37.055577 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" event={"ID":"3112d62b-5125-4614-a5c3-6a50bf1cc515","Type":"ContainerStarted","Data":"0306507ab2e71e5c042cb5dd662c291dae20b15116fda113bef8add92903fd74"} Dec 05 06:28:38 crc kubenswrapper[4865]: I1205 06:28:38.043608 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" podStartSLOduration=2.6201588129999998 podStartE2EDuration="3.043587126s" podCreationTimestamp="2025-12-05 06:28:35 +0000 UTC" firstStartedPulling="2025-12-05 06:28:36.061564812 +0000 UTC m=+2135.341576034" lastFinishedPulling="2025-12-05 06:28:36.484993085 +0000 UTC m=+2135.765004347" observedRunningTime="2025-12-05 06:28:37.100304311 +0000 UTC m=+2136.380315533" watchObservedRunningTime="2025-12-05 06:28:38.043587126 +0000 UTC m=+2137.323598348" Dec 05 06:28:38 crc kubenswrapper[4865]: I1205 06:28:38.049307 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjbkg"] Dec 05 06:28:38 crc kubenswrapper[4865]: I1205 06:28:38.062885 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rjbkg" 
podUID="29f579d9-6001-4b5d-b4ab-be0e764fa46d" containerName="registry-server" containerID="cri-o://d7ebcf268ddb9fb4758ca143a1fc10fd92af0ca8963c3118ca13a82b5791c919" gracePeriod=2 Dec 05 06:28:38 crc kubenswrapper[4865]: I1205 06:28:38.538625 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjbkg" Dec 05 06:28:38 crc kubenswrapper[4865]: I1205 06:28:38.671027 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29f579d9-6001-4b5d-b4ab-be0e764fa46d-catalog-content\") pod \"29f579d9-6001-4b5d-b4ab-be0e764fa46d\" (UID: \"29f579d9-6001-4b5d-b4ab-be0e764fa46d\") " Dec 05 06:28:38 crc kubenswrapper[4865]: I1205 06:28:38.671114 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87g26\" (UniqueName: \"kubernetes.io/projected/29f579d9-6001-4b5d-b4ab-be0e764fa46d-kube-api-access-87g26\") pod \"29f579d9-6001-4b5d-b4ab-be0e764fa46d\" (UID: \"29f579d9-6001-4b5d-b4ab-be0e764fa46d\") " Dec 05 06:28:38 crc kubenswrapper[4865]: I1205 06:28:38.671247 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29f579d9-6001-4b5d-b4ab-be0e764fa46d-utilities\") pod \"29f579d9-6001-4b5d-b4ab-be0e764fa46d\" (UID: \"29f579d9-6001-4b5d-b4ab-be0e764fa46d\") " Dec 05 06:28:38 crc kubenswrapper[4865]: I1205 06:28:38.672651 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29f579d9-6001-4b5d-b4ab-be0e764fa46d-utilities" (OuterVolumeSpecName: "utilities") pod "29f579d9-6001-4b5d-b4ab-be0e764fa46d" (UID: "29f579d9-6001-4b5d-b4ab-be0e764fa46d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:28:38 crc kubenswrapper[4865]: I1205 06:28:38.678913 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29f579d9-6001-4b5d-b4ab-be0e764fa46d-kube-api-access-87g26" (OuterVolumeSpecName: "kube-api-access-87g26") pod "29f579d9-6001-4b5d-b4ab-be0e764fa46d" (UID: "29f579d9-6001-4b5d-b4ab-be0e764fa46d"). InnerVolumeSpecName "kube-api-access-87g26". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:28:38 crc kubenswrapper[4865]: I1205 06:28:38.694601 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29f579d9-6001-4b5d-b4ab-be0e764fa46d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29f579d9-6001-4b5d-b4ab-be0e764fa46d" (UID: "29f579d9-6001-4b5d-b4ab-be0e764fa46d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:28:38 crc kubenswrapper[4865]: I1205 06:28:38.774064 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87g26\" (UniqueName: \"kubernetes.io/projected/29f579d9-6001-4b5d-b4ab-be0e764fa46d-kube-api-access-87g26\") on node \"crc\" DevicePath \"\"" Dec 05 06:28:38 crc kubenswrapper[4865]: I1205 06:28:38.774122 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29f579d9-6001-4b5d-b4ab-be0e764fa46d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:28:38 crc kubenswrapper[4865]: I1205 06:28:38.774134 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29f579d9-6001-4b5d-b4ab-be0e764fa46d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:28:39 crc kubenswrapper[4865]: I1205 06:28:39.092882 4865 generic.go:334] "Generic (PLEG): container finished" podID="29f579d9-6001-4b5d-b4ab-be0e764fa46d" containerID="d7ebcf268ddb9fb4758ca143a1fc10fd92af0ca8963c3118ca13a82b5791c919" exitCode=0 Dec 05 06:28:39 crc kubenswrapper[4865]: I1205 06:28:39.095039 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjbkg" Dec 05 06:28:39 crc kubenswrapper[4865]: I1205 06:28:39.095107 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjbkg" event={"ID":"29f579d9-6001-4b5d-b4ab-be0e764fa46d","Type":"ContainerDied","Data":"d7ebcf268ddb9fb4758ca143a1fc10fd92af0ca8963c3118ca13a82b5791c919"} Dec 05 06:28:39 crc kubenswrapper[4865]: I1205 06:28:39.096299 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjbkg" event={"ID":"29f579d9-6001-4b5d-b4ab-be0e764fa46d","Type":"ContainerDied","Data":"bf39ebc00862315784f68e394e6aaf1993fc95a3c947ea054665eb598c66ecda"} Dec 05 06:28:39 crc kubenswrapper[4865]: I1205 06:28:39.096411 4865 scope.go:117] "RemoveContainer" containerID="d7ebcf268ddb9fb4758ca143a1fc10fd92af0ca8963c3118ca13a82b5791c919" Dec 05 06:28:39 crc kubenswrapper[4865]: I1205 06:28:39.126869 4865 scope.go:117] "RemoveContainer" containerID="a5b8526e579cd57b4e1da7e765b0be2eda354a7147ecbe56f14ed04f8839e498" Dec 05 06:28:39 crc kubenswrapper[4865]: I1205 06:28:39.130000 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjbkg"] Dec 05 06:28:39 crc kubenswrapper[4865]: I1205 06:28:39.140315 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjbkg"] Dec 05 06:28:39 crc kubenswrapper[4865]: I1205 06:28:39.150856 4865 scope.go:117] "RemoveContainer" containerID="25f3a0ecfff0edbb94c24b00dc8c990d65b05d270bd27c3956d8ec2ffae7ed50" Dec 05 06:28:39 crc kubenswrapper[4865]: I1205 06:28:39.201712 4865 scope.go:117] "RemoveContainer" containerID="d7ebcf268ddb9fb4758ca143a1fc10fd92af0ca8963c3118ca13a82b5791c919" Dec 05 06:28:39 crc kubenswrapper[4865]: E1205 06:28:39.203238 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7ebcf268ddb9fb4758ca143a1fc10fd92af0ca8963c3118ca13a82b5791c919\": container with ID starting with d7ebcf268ddb9fb4758ca143a1fc10fd92af0ca8963c3118ca13a82b5791c919 not found: ID does not exist" containerID="d7ebcf268ddb9fb4758ca143a1fc10fd92af0ca8963c3118ca13a82b5791c919" Dec 05 06:28:39 crc kubenswrapper[4865]: I1205 06:28:39.203380 4865 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7ebcf268ddb9fb4758ca143a1fc10fd92af0ca8963c3118ca13a82b5791c919"} err="failed to get container status \"d7ebcf268ddb9fb4758ca143a1fc10fd92af0ca8963c3118ca13a82b5791c919\": rpc error: code = NotFound desc = could not find container \"d7ebcf268ddb9fb4758ca143a1fc10fd92af0ca8963c3118ca13a82b5791c919\": container with ID starting with d7ebcf268ddb9fb4758ca143a1fc10fd92af0ca8963c3118ca13a82b5791c919 not found: ID does not exist" Dec 05 06:28:39 crc kubenswrapper[4865]: I1205 06:28:39.203489 4865 scope.go:117] "RemoveContainer" containerID="a5b8526e579cd57b4e1da7e765b0be2eda354a7147ecbe56f14ed04f8839e498" Dec 05 06:28:39 crc kubenswrapper[4865]: E1205 06:28:39.203908 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5b8526e579cd57b4e1da7e765b0be2eda354a7147ecbe56f14ed04f8839e498\": container with ID starting with a5b8526e579cd57b4e1da7e765b0be2eda354a7147ecbe56f14ed04f8839e498 not found: ID does not exist" containerID="a5b8526e579cd57b4e1da7e765b0be2eda354a7147ecbe56f14ed04f8839e498" Dec 05 06:28:39 crc kubenswrapper[4865]: I1205 06:28:39.203964 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5b8526e579cd57b4e1da7e765b0be2eda354a7147ecbe56f14ed04f8839e498"} err="failed to get container status \"a5b8526e579cd57b4e1da7e765b0be2eda354a7147ecbe56f14ed04f8839e498\": rpc error: code = NotFound desc = could not find container \"a5b8526e579cd57b4e1da7e765b0be2eda354a7147ecbe56f14ed04f8839e498\": container with ID starting with a5b8526e579cd57b4e1da7e765b0be2eda354a7147ecbe56f14ed04f8839e498 not found: ID does not exist" Dec 05 06:28:39 crc kubenswrapper[4865]: I1205 06:28:39.204010 4865 scope.go:117] "RemoveContainer" containerID="25f3a0ecfff0edbb94c24b00dc8c990d65b05d270bd27c3956d8ec2ffae7ed50" Dec 05 06:28:39 crc kubenswrapper[4865]: E1205 06:28:39.204585 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f3a0ecfff0edbb94c24b00dc8c990d65b05d270bd27c3956d8ec2ffae7ed50\": container with ID starting with 25f3a0ecfff0edbb94c24b00dc8c990d65b05d270bd27c3956d8ec2ffae7ed50 not found: ID does not exist" containerID="25f3a0ecfff0edbb94c24b00dc8c990d65b05d270bd27c3956d8ec2ffae7ed50" Dec 05 06:28:39 crc kubenswrapper[4865]: I1205 06:28:39.204608 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f3a0ecfff0edbb94c24b00dc8c990d65b05d270bd27c3956d8ec2ffae7ed50"} err="failed to get container status \"25f3a0ecfff0edbb94c24b00dc8c990d65b05d270bd27c3956d8ec2ffae7ed50\": rpc error: code = NotFound desc = could not find container \"25f3a0ecfff0edbb94c24b00dc8c990d65b05d270bd27c3956d8ec2ffae7ed50\": container with ID starting with 25f3a0ecfff0edbb94c24b00dc8c990d65b05d270bd27c3956d8ec2ffae7ed50 not found: ID does not exist" Dec 05 06:28:41 crc kubenswrapper[4865]: I1205 06:28:41.025435 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29f579d9-6001-4b5d-b4ab-be0e764fa46d" path="/var/lib/kubelet/pods/29f579d9-6001-4b5d-b4ab-be0e764fa46d/volumes" Dec 05 06:28:54 crc kubenswrapper[4865]: I1205 06:28:54.078568 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6f6k6"] Dec 05 06:28:54 crc kubenswrapper[4865]: E1205 06:28:54.081048 4865 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="29f579d9-6001-4b5d-b4ab-be0e764fa46d" containerName="registry-server" Dec 05 06:28:54 crc kubenswrapper[4865]: I1205 06:28:54.081136 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f579d9-6001-4b5d-b4ab-be0e764fa46d" containerName="registry-server" Dec 05 06:28:54 crc kubenswrapper[4865]: E1205 06:28:54.081209 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f579d9-6001-4b5d-b4ab-be0e764fa46d" containerName="extract-content" Dec 05 06:28:54 crc kubenswrapper[4865]: I1205 06:28:54.081289 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f579d9-6001-4b5d-b4ab-be0e764fa46d" containerName="extract-content" Dec 05 06:28:54 crc kubenswrapper[4865]: E1205 06:28:54.081373 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f579d9-6001-4b5d-b4ab-be0e764fa46d" containerName="extract-utilities" Dec 05 06:28:54 crc kubenswrapper[4865]: I1205 06:28:54.081444 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f579d9-6001-4b5d-b4ab-be0e764fa46d" containerName="extract-utilities" Dec 05 06:28:54 crc kubenswrapper[4865]: I1205 06:28:54.081699 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="29f579d9-6001-4b5d-b4ab-be0e764fa46d" containerName="registry-server" Dec 05 06:28:54 crc kubenswrapper[4865]: I1205 06:28:54.083389 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6f6k6" Dec 05 06:28:54 crc kubenswrapper[4865]: I1205 06:28:54.091522 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6f6k6"] Dec 05 06:28:54 crc kubenswrapper[4865]: I1205 06:28:54.218185 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5007a243-d5f7-415a-b42a-0c97a660763b-catalog-content\") pod \"community-operators-6f6k6\" (UID: \"5007a243-d5f7-415a-b42a-0c97a660763b\") " pod="openshift-marketplace/community-operators-6f6k6" Dec 05 06:28:54 crc kubenswrapper[4865]: I1205 06:28:54.218449 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5007a243-d5f7-415a-b42a-0c97a660763b-utilities\") pod \"community-operators-6f6k6\" (UID: \"5007a243-d5f7-415a-b42a-0c97a660763b\") " pod="openshift-marketplace/community-operators-6f6k6" Dec 05 06:28:54 crc kubenswrapper[4865]: I1205 06:28:54.218506 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vplng\" (UniqueName: \"kubernetes.io/projected/5007a243-d5f7-415a-b42a-0c97a660763b-kube-api-access-vplng\") pod \"community-operators-6f6k6\" (UID: \"5007a243-d5f7-415a-b42a-0c97a660763b\") " pod="openshift-marketplace/community-operators-6f6k6" Dec 05 06:28:54 crc kubenswrapper[4865]: I1205 06:28:54.320497 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5007a243-d5f7-415a-b42a-0c97a660763b-utilities\") pod \"community-operators-6f6k6\" (UID: \"5007a243-d5f7-415a-b42a-0c97a660763b\") " pod="openshift-marketplace/community-operators-6f6k6" Dec 05 06:28:54 crc kubenswrapper[4865]: I1205 06:28:54.320992 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vplng\" (UniqueName: \"kubernetes.io/projected/5007a243-d5f7-415a-b42a-0c97a660763b-kube-api-access-vplng\") pod 
\"community-operators-6f6k6\" (UID: \"5007a243-d5f7-415a-b42a-0c97a660763b\") " pod="openshift-marketplace/community-operators-6f6k6" Dec 05 06:28:54 crc kubenswrapper[4865]: I1205 06:28:54.321031 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5007a243-d5f7-415a-b42a-0c97a660763b-utilities\") pod \"community-operators-6f6k6\" (UID: \"5007a243-d5f7-415a-b42a-0c97a660763b\") " pod="openshift-marketplace/community-operators-6f6k6" Dec 05 06:28:54 crc kubenswrapper[4865]: I1205 06:28:54.321202 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5007a243-d5f7-415a-b42a-0c97a660763b-catalog-content\") pod \"community-operators-6f6k6\" (UID: \"5007a243-d5f7-415a-b42a-0c97a660763b\") " pod="openshift-marketplace/community-operators-6f6k6" Dec 05 06:28:54 crc kubenswrapper[4865]: I1205 06:28:54.321526 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5007a243-d5f7-415a-b42a-0c97a660763b-catalog-content\") pod \"community-operators-6f6k6\" (UID: \"5007a243-d5f7-415a-b42a-0c97a660763b\") " pod="openshift-marketplace/community-operators-6f6k6" Dec 05 06:28:54 crc kubenswrapper[4865]: I1205 06:28:54.343792 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vplng\" (UniqueName: \"kubernetes.io/projected/5007a243-d5f7-415a-b42a-0c97a660763b-kube-api-access-vplng\") pod \"community-operators-6f6k6\" (UID: \"5007a243-d5f7-415a-b42a-0c97a660763b\") " pod="openshift-marketplace/community-operators-6f6k6" Dec 05 06:28:54 crc kubenswrapper[4865]: I1205 06:28:54.412713 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6f6k6" Dec 05 06:28:55 crc kubenswrapper[4865]: I1205 06:28:55.041987 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6f6k6"] Dec 05 06:28:55 crc kubenswrapper[4865]: I1205 06:28:55.241810 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6f6k6" event={"ID":"5007a243-d5f7-415a-b42a-0c97a660763b","Type":"ContainerStarted","Data":"d87f02a3b6e9b064315963875bae3903924f99258f999321db8206a74962ce54"} Dec 05 06:28:56 crc kubenswrapper[4865]: I1205 06:28:56.252169 4865 generic.go:334] "Generic (PLEG): container finished" podID="5007a243-d5f7-415a-b42a-0c97a660763b" containerID="015d402f4d9bd46417f71bd0ad06286dbcb4df4199d269dcf2214489b8019c91" exitCode=0 Dec 05 06:28:56 crc kubenswrapper[4865]: I1205 06:28:56.252266 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6f6k6" event={"ID":"5007a243-d5f7-415a-b42a-0c97a660763b","Type":"ContainerDied","Data":"015d402f4d9bd46417f71bd0ad06286dbcb4df4199d269dcf2214489b8019c91"} Dec 05 06:28:57 crc kubenswrapper[4865]: I1205 06:28:57.269116 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6f6k6" event={"ID":"5007a243-d5f7-415a-b42a-0c97a660763b","Type":"ContainerStarted","Data":"515f58fbfa6519b9834281f2de38f0fbc03a599b010b3364308558085f9c53bd"} Dec 05 06:28:59 crc kubenswrapper[4865]: I1205 06:28:59.289783 4865 generic.go:334] "Generic (PLEG): container finished" podID="5007a243-d5f7-415a-b42a-0c97a660763b" containerID="515f58fbfa6519b9834281f2de38f0fbc03a599b010b3364308558085f9c53bd" exitCode=0 Dec 05 06:28:59 crc kubenswrapper[4865]: I1205 06:28:59.290311 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6f6k6" event={"ID":"5007a243-d5f7-415a-b42a-0c97a660763b","Type":"ContainerDied","Data":"515f58fbfa6519b9834281f2de38f0fbc03a599b010b3364308558085f9c53bd"} Dec 05 06:29:00 crc kubenswrapper[4865]: I1205 06:29:00.302626 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6f6k6" event={"ID":"5007a243-d5f7-415a-b42a-0c97a660763b","Type":"ContainerStarted","Data":"24b7212b31399b61dd6ac119ba252233daf4da00a87ce01a90783965ff0fb632"} Dec 05 06:29:00 crc kubenswrapper[4865]: I1205 06:29:00.332358 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6f6k6" podStartSLOduration=2.9137655049999998 podStartE2EDuration="6.332330873s" podCreationTimestamp="2025-12-05 06:28:54 +0000 UTC" firstStartedPulling="2025-12-05 06:28:56.254750857 +0000 UTC m=+2155.534762079" lastFinishedPulling="2025-12-05 06:28:59.673316215 +0000 UTC m=+2158.953327447" observedRunningTime="2025-12-05 06:29:00.320240679 +0000 UTC m=+2159.600251911" watchObservedRunningTime="2025-12-05 06:29:00.332330873 +0000 UTC m=+2159.612342095" Dec 05 06:29:04 crc kubenswrapper[4865]: I1205 06:29:04.413100 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6f6k6" Dec 05 06:29:04 crc kubenswrapper[4865]: I1205 06:29:04.413614 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6f6k6" Dec 05 06:29:04 crc kubenswrapper[4865]: I1205 06:29:04.469756 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-6f6k6" Dec 05 06:29:05 crc kubenswrapper[4865]: I1205 06:29:05.448734 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6f6k6" Dec 05 06:29:05 crc kubenswrapper[4865]: I1205 06:29:05.513342 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6f6k6"] Dec 05 06:29:07 crc kubenswrapper[4865]: I1205 06:29:07.365924 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6f6k6" podUID="5007a243-d5f7-415a-b42a-0c97a660763b" containerName="registry-server" containerID="cri-o://24b7212b31399b61dd6ac119ba252233daf4da00a87ce01a90783965ff0fb632" gracePeriod=2 Dec 05 06:29:07 crc kubenswrapper[4865]: I1205 06:29:07.831051 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6f6k6" Dec 05 06:29:07 crc kubenswrapper[4865]: I1205 06:29:07.914262 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5007a243-d5f7-415a-b42a-0c97a660763b-utilities\") pod \"5007a243-d5f7-415a-b42a-0c97a660763b\" (UID: \"5007a243-d5f7-415a-b42a-0c97a660763b\") " Dec 05 06:29:07 crc kubenswrapper[4865]: I1205 06:29:07.914340 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vplng\" (UniqueName: \"kubernetes.io/projected/5007a243-d5f7-415a-b42a-0c97a660763b-kube-api-access-vplng\") pod \"5007a243-d5f7-415a-b42a-0c97a660763b\" (UID: \"5007a243-d5f7-415a-b42a-0c97a660763b\") " Dec 05 06:29:07 crc kubenswrapper[4865]: I1205 06:29:07.914502 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5007a243-d5f7-415a-b42a-0c97a660763b-catalog-content\") pod \"5007a243-d5f7-415a-b42a-0c97a660763b\" (UID: \"5007a243-d5f7-415a-b42a-0c97a660763b\") " Dec 05 06:29:07 crc kubenswrapper[4865]: I1205 06:29:07.915121 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5007a243-d5f7-415a-b42a-0c97a660763b-utilities" (OuterVolumeSpecName: "utilities") pod "5007a243-d5f7-415a-b42a-0c97a660763b" (UID: "5007a243-d5f7-415a-b42a-0c97a660763b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:29:07 crc kubenswrapper[4865]: I1205 06:29:07.923223 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5007a243-d5f7-415a-b42a-0c97a660763b-kube-api-access-vplng" (OuterVolumeSpecName: "kube-api-access-vplng") pod "5007a243-d5f7-415a-b42a-0c97a660763b" (UID: "5007a243-d5f7-415a-b42a-0c97a660763b"). InnerVolumeSpecName "kube-api-access-vplng". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:29:07 crc kubenswrapper[4865]: I1205 06:29:07.961605 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5007a243-d5f7-415a-b42a-0c97a660763b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5007a243-d5f7-415a-b42a-0c97a660763b" (UID: "5007a243-d5f7-415a-b42a-0c97a660763b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:29:08 crc kubenswrapper[4865]: I1205 06:29:08.017019 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5007a243-d5f7-415a-b42a-0c97a660763b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:29:08 crc kubenswrapper[4865]: I1205 06:29:08.017062 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5007a243-d5f7-415a-b42a-0c97a660763b-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:29:08 crc kubenswrapper[4865]: I1205 06:29:08.017077 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vplng\" (UniqueName: \"kubernetes.io/projected/5007a243-d5f7-415a-b42a-0c97a660763b-kube-api-access-vplng\") on node \"crc\" DevicePath \"\"" Dec 05 06:29:08 crc kubenswrapper[4865]: I1205 06:29:08.382660 4865 generic.go:334] "Generic (PLEG): container finished" podID="5007a243-d5f7-415a-b42a-0c97a660763b" containerID="24b7212b31399b61dd6ac119ba252233daf4da00a87ce01a90783965ff0fb632" exitCode=0 Dec 05 06:29:08 crc kubenswrapper[4865]: I1205 06:29:08.382714 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6f6k6" event={"ID":"5007a243-d5f7-415a-b42a-0c97a660763b","Type":"ContainerDied","Data":"24b7212b31399b61dd6ac119ba252233daf4da00a87ce01a90783965ff0fb632"} Dec 05 06:29:08 crc kubenswrapper[4865]: I1205 06:29:08.382749 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6f6k6" event={"ID":"5007a243-d5f7-415a-b42a-0c97a660763b","Type":"ContainerDied","Data":"d87f02a3b6e9b064315963875bae3903924f99258f999321db8206a74962ce54"} Dec 05 06:29:08 crc kubenswrapper[4865]: I1205 06:29:08.382771 4865 scope.go:117] "RemoveContainer" containerID="24b7212b31399b61dd6ac119ba252233daf4da00a87ce01a90783965ff0fb632" Dec 05 06:29:08 crc kubenswrapper[4865]: I1205 06:29:08.383668 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6f6k6" Dec 05 06:29:08 crc kubenswrapper[4865]: I1205 06:29:08.409600 4865 scope.go:117] "RemoveContainer" containerID="515f58fbfa6519b9834281f2de38f0fbc03a599b010b3364308558085f9c53bd" Dec 05 06:29:08 crc kubenswrapper[4865]: I1205 06:29:08.430271 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6f6k6"] Dec 05 06:29:08 crc kubenswrapper[4865]: I1205 06:29:08.440747 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6f6k6"] Dec 05 06:29:08 crc kubenswrapper[4865]: I1205 06:29:08.443895 4865 scope.go:117] "RemoveContainer" containerID="015d402f4d9bd46417f71bd0ad06286dbcb4df4199d269dcf2214489b8019c91" Dec 05 06:29:08 crc kubenswrapper[4865]: I1205 06:29:08.489427 4865 scope.go:117] "RemoveContainer" containerID="24b7212b31399b61dd6ac119ba252233daf4da00a87ce01a90783965ff0fb632" Dec 05 06:29:08 crc kubenswrapper[4865]: E1205 06:29:08.489813 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24b7212b31399b61dd6ac119ba252233daf4da00a87ce01a90783965ff0fb632\": container with ID starting with 24b7212b31399b61dd6ac119ba252233daf4da00a87ce01a90783965ff0fb632 not found: ID does not exist" containerID="24b7212b31399b61dd6ac119ba252233daf4da00a87ce01a90783965ff0fb632" Dec 05 06:29:08 crc kubenswrapper[4865]: I1205 06:29:08.489960 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b7212b31399b61dd6ac119ba252233daf4da00a87ce01a90783965ff0fb632"} err="failed to get container status \"24b7212b31399b61dd6ac119ba252233daf4da00a87ce01a90783965ff0fb632\": rpc error: code = NotFound desc = could not find container \"24b7212b31399b61dd6ac119ba252233daf4da00a87ce01a90783965ff0fb632\": container with ID starting with 24b7212b31399b61dd6ac119ba252233daf4da00a87ce01a90783965ff0fb632 not found: ID does not exist" Dec 05 06:29:08 crc kubenswrapper[4865]: I1205 06:29:08.490013 4865 scope.go:117] "RemoveContainer" containerID="515f58fbfa6519b9834281f2de38f0fbc03a599b010b3364308558085f9c53bd" Dec 05 06:29:08 crc kubenswrapper[4865]: E1205 06:29:08.490275 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"515f58fbfa6519b9834281f2de38f0fbc03a599b010b3364308558085f9c53bd\": container with ID starting with 515f58fbfa6519b9834281f2de38f0fbc03a599b010b3364308558085f9c53bd not found: ID does not exist" containerID="515f58fbfa6519b9834281f2de38f0fbc03a599b010b3364308558085f9c53bd" Dec 05 06:29:08 crc kubenswrapper[4865]: I1205 06:29:08.490301 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"515f58fbfa6519b9834281f2de38f0fbc03a599b010b3364308558085f9c53bd"} err="failed to get container status \"515f58fbfa6519b9834281f2de38f0fbc03a599b010b3364308558085f9c53bd\": rpc error: code = NotFound desc = could not find container \"515f58fbfa6519b9834281f2de38f0fbc03a599b010b3364308558085f9c53bd\": container with ID starting with 515f58fbfa6519b9834281f2de38f0fbc03a599b010b3364308558085f9c53bd not found: ID does not exist" Dec 05 06:29:08 crc kubenswrapper[4865]: I1205 06:29:08.490318 4865 scope.go:117] "RemoveContainer" containerID="015d402f4d9bd46417f71bd0ad06286dbcb4df4199d269dcf2214489b8019c91" Dec 05 06:29:08 crc kubenswrapper[4865]: E1205 06:29:08.490619 4865 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"015d402f4d9bd46417f71bd0ad06286dbcb4df4199d269dcf2214489b8019c91\": container with ID starting with 015d402f4d9bd46417f71bd0ad06286dbcb4df4199d269dcf2214489b8019c91 not found: ID does not exist" containerID="015d402f4d9bd46417f71bd0ad06286dbcb4df4199d269dcf2214489b8019c91" Dec 05 06:29:08 crc kubenswrapper[4865]: I1205 06:29:08.490644 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"015d402f4d9bd46417f71bd0ad06286dbcb4df4199d269dcf2214489b8019c91"} err="failed to get container status \"015d402f4d9bd46417f71bd0ad06286dbcb4df4199d269dcf2214489b8019c91\": rpc error: code = NotFound desc = could not find container \"015d402f4d9bd46417f71bd0ad06286dbcb4df4199d269dcf2214489b8019c91\": container with ID starting with 015d402f4d9bd46417f71bd0ad06286dbcb4df4199d269dcf2214489b8019c91 not found: ID does not exist" Dec 05 06:29:09 crc kubenswrapper[4865]: I1205 06:29:09.019018 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5007a243-d5f7-415a-b42a-0c97a660763b" path="/var/lib/kubelet/pods/5007a243-d5f7-415a-b42a-0c97a660763b/volumes" Dec 05 06:29:11 crc kubenswrapper[4865]: I1205 06:29:11.049195 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:29:11 crc kubenswrapper[4865]: I1205 06:29:11.050071 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:29:20 crc kubenswrapper[4865]: I1205 06:29:20.507521 4865 generic.go:334] "Generic (PLEG): container finished" podID="3112d62b-5125-4614-a5c3-6a50bf1cc515" containerID="99747395e299a32c26161df96fc342041e17cefd6fdf164a1082e51364babab0" exitCode=0 Dec 05 06:29:20 crc kubenswrapper[4865]: I1205 06:29:20.507594 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" event={"ID":"3112d62b-5125-4614-a5c3-6a50bf1cc515","Type":"ContainerDied","Data":"99747395e299a32c26161df96fc342041e17cefd6fdf164a1082e51364babab0"} Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.912456 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.983085 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-nova-combined-ca-bundle\") pod \"3112d62b-5125-4614-a5c3-6a50bf1cc515\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.983157 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-ssh-key\") pod \"3112d62b-5125-4614-a5c3-6a50bf1cc515\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.983191 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-bootstrap-combined-ca-bundle\") pod \"3112d62b-5125-4614-a5c3-6a50bf1cc515\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.983225 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-inventory\") pod \"3112d62b-5125-4614-a5c3-6a50bf1cc515\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.983274 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-ovn-default-certs-0\") pod \"3112d62b-5125-4614-a5c3-6a50bf1cc515\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.983324 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-repo-setup-combined-ca-bundle\") pod \"3112d62b-5125-4614-a5c3-6a50bf1cc515\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.983376 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"3112d62b-5125-4614-a5c3-6a50bf1cc515\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.983404 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"3112d62b-5125-4614-a5c3-6a50bf1cc515\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.983423 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-telemetry-combined-ca-bundle\") pod \"3112d62b-5125-4614-a5c3-6a50bf1cc515\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " Dec 05 06:29:21 crc 
kubenswrapper[4865]: I1205 06:29:21.984087 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"3112d62b-5125-4614-a5c3-6a50bf1cc515\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.984130 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-neutron-metadata-combined-ca-bundle\") pod \"3112d62b-5125-4614-a5c3-6a50bf1cc515\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.984158 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-libvirt-combined-ca-bundle\") pod \"3112d62b-5125-4614-a5c3-6a50bf1cc515\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.984253 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-ovn-combined-ca-bundle\") pod \"3112d62b-5125-4614-a5c3-6a50bf1cc515\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.984277 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j46bj\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-kube-api-access-j46bj\") pod \"3112d62b-5125-4614-a5c3-6a50bf1cc515\" (UID: \"3112d62b-5125-4614-a5c3-6a50bf1cc515\") " Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.988912 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "3112d62b-5125-4614-a5c3-6a50bf1cc515" (UID: "3112d62b-5125-4614-a5c3-6a50bf1cc515"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.989379 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "3112d62b-5125-4614-a5c3-6a50bf1cc515" (UID: "3112d62b-5125-4614-a5c3-6a50bf1cc515"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.989488 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "3112d62b-5125-4614-a5c3-6a50bf1cc515" (UID: "3112d62b-5125-4614-a5c3-6a50bf1cc515"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.991372 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "3112d62b-5125-4614-a5c3-6a50bf1cc515" (UID: "3112d62b-5125-4614-a5c3-6a50bf1cc515"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.991618 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "3112d62b-5125-4614-a5c3-6a50bf1cc515" (UID: "3112d62b-5125-4614-a5c3-6a50bf1cc515"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.994046 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-kube-api-access-j46bj" (OuterVolumeSpecName: "kube-api-access-j46bj") pod "3112d62b-5125-4614-a5c3-6a50bf1cc515" (UID: "3112d62b-5125-4614-a5c3-6a50bf1cc515"). InnerVolumeSpecName "kube-api-access-j46bj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.994092 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "3112d62b-5125-4614-a5c3-6a50bf1cc515" (UID: "3112d62b-5125-4614-a5c3-6a50bf1cc515"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.994463 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "3112d62b-5125-4614-a5c3-6a50bf1cc515" (UID: "3112d62b-5125-4614-a5c3-6a50bf1cc515"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.996305 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "3112d62b-5125-4614-a5c3-6a50bf1cc515" (UID: "3112d62b-5125-4614-a5c3-6a50bf1cc515"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:29:21 crc kubenswrapper[4865]: I1205 06:29:21.998444 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3112d62b-5125-4614-a5c3-6a50bf1cc515" (UID: "3112d62b-5125-4614-a5c3-6a50bf1cc515"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.000433 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "3112d62b-5125-4614-a5c3-6a50bf1cc515" (UID: "3112d62b-5125-4614-a5c3-6a50bf1cc515"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.012998 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "3112d62b-5125-4614-a5c3-6a50bf1cc515" (UID: "3112d62b-5125-4614-a5c3-6a50bf1cc515"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.018314 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-inventory" (OuterVolumeSpecName: "inventory") pod "3112d62b-5125-4614-a5c3-6a50bf1cc515" (UID: "3112d62b-5125-4614-a5c3-6a50bf1cc515"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.035013 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3112d62b-5125-4614-a5c3-6a50bf1cc515" (UID: "3112d62b-5125-4614-a5c3-6a50bf1cc515"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.086611 4865 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.086640 4865 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.086650 4865 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.086660 4865 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.086669 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j46bj\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-kube-api-access-j46bj\") on node \"crc\" DevicePath \"\"" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.086677 4865 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.086685 4865 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.086693 4865 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.086701 4865 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.086711 4865 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.086719 4865 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.086731 4865 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node 
\"crc\" DevicePath \"\"" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.086741 4865 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3112d62b-5125-4614-a5c3-6a50bf1cc515-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.086749 4865 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3112d62b-5125-4614-a5c3-6a50bf1cc515-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.529069 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" event={"ID":"3112d62b-5125-4614-a5c3-6a50bf1cc515","Type":"ContainerDied","Data":"0306507ab2e71e5c042cb5dd662c291dae20b15116fda113bef8add92903fd74"} Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.529120 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0306507ab2e71e5c042cb5dd662c291dae20b15116fda113bef8add92903fd74" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.529189 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-br9nk" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.691409 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8"] Dec 05 06:29:22 crc kubenswrapper[4865]: E1205 06:29:22.692161 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5007a243-d5f7-415a-b42a-0c97a660763b" containerName="extract-content" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.692187 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="5007a243-d5f7-415a-b42a-0c97a660763b" containerName="extract-content" Dec 05 06:29:22 crc kubenswrapper[4865]: E1205 06:29:22.692221 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3112d62b-5125-4614-a5c3-6a50bf1cc515" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.692231 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="3112d62b-5125-4614-a5c3-6a50bf1cc515" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 06:29:22 crc kubenswrapper[4865]: E1205 06:29:22.692251 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5007a243-d5f7-415a-b42a-0c97a660763b" containerName="extract-utilities" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.692260 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="5007a243-d5f7-415a-b42a-0c97a660763b" containerName="extract-utilities" Dec 05 06:29:22 crc kubenswrapper[4865]: E1205 06:29:22.692277 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5007a243-d5f7-415a-b42a-0c97a660763b" containerName="registry-server" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.692284 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="5007a243-d5f7-415a-b42a-0c97a660763b" containerName="registry-server" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.692500 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="3112d62b-5125-4614-a5c3-6a50bf1cc515" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 
06:29:22.692533 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="5007a243-d5f7-415a-b42a-0c97a660763b" containerName="registry-server" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.693338 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.696615 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.697362 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.697501 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.697622 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gtc4b" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.698959 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.709017 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8"] Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.798315 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c64b4\" (UniqueName: \"kubernetes.io/projected/d6e75882-16f5-4c56-90a8-43d35503e87d-kube-api-access-c64b4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xr5z8\" (UID: \"d6e75882-16f5-4c56-90a8-43d35503e87d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.798393 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e75882-16f5-4c56-90a8-43d35503e87d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xr5z8\" (UID: \"d6e75882-16f5-4c56-90a8-43d35503e87d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.798645 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d6e75882-16f5-4c56-90a8-43d35503e87d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xr5z8\" (UID: \"d6e75882-16f5-4c56-90a8-43d35503e87d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.798753 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6e75882-16f5-4c56-90a8-43d35503e87d-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xr5z8\" (UID: \"d6e75882-16f5-4c56-90a8-43d35503e87d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.798859 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6e75882-16f5-4c56-90a8-43d35503e87d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xr5z8\" (UID: 
\"d6e75882-16f5-4c56-90a8-43d35503e87d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.900512 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d6e75882-16f5-4c56-90a8-43d35503e87d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xr5z8\" (UID: \"d6e75882-16f5-4c56-90a8-43d35503e87d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.900567 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6e75882-16f5-4c56-90a8-43d35503e87d-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xr5z8\" (UID: \"d6e75882-16f5-4c56-90a8-43d35503e87d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.900608 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6e75882-16f5-4c56-90a8-43d35503e87d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xr5z8\" (UID: \"d6e75882-16f5-4c56-90a8-43d35503e87d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.900687 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c64b4\" (UniqueName: \"kubernetes.io/projected/d6e75882-16f5-4c56-90a8-43d35503e87d-kube-api-access-c64b4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xr5z8\" (UID: \"d6e75882-16f5-4c56-90a8-43d35503e87d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.900741 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e75882-16f5-4c56-90a8-43d35503e87d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xr5z8\" (UID: \"d6e75882-16f5-4c56-90a8-43d35503e87d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.901496 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d6e75882-16f5-4c56-90a8-43d35503e87d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xr5z8\" (UID: \"d6e75882-16f5-4c56-90a8-43d35503e87d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.904710 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6e75882-16f5-4c56-90a8-43d35503e87d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xr5z8\" (UID: \"d6e75882-16f5-4c56-90a8-43d35503e87d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.904783 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6e75882-16f5-4c56-90a8-43d35503e87d-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xr5z8\" (UID: \"d6e75882-16f5-4c56-90a8-43d35503e87d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.905527 4865 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e75882-16f5-4c56-90a8-43d35503e87d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xr5z8\" (UID: \"d6e75882-16f5-4c56-90a8-43d35503e87d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" Dec 05 06:29:22 crc kubenswrapper[4865]: I1205 06:29:22.926670 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c64b4\" (UniqueName: \"kubernetes.io/projected/d6e75882-16f5-4c56-90a8-43d35503e87d-kube-api-access-c64b4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xr5z8\" (UID: \"d6e75882-16f5-4c56-90a8-43d35503e87d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" Dec 05 06:29:23 crc kubenswrapper[4865]: I1205 06:29:23.025745 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" Dec 05 06:29:23 crc kubenswrapper[4865]: I1205 06:29:23.632420 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8"] Dec 05 06:29:24 crc kubenswrapper[4865]: I1205 06:29:24.552343 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" event={"ID":"d6e75882-16f5-4c56-90a8-43d35503e87d","Type":"ContainerStarted","Data":"be0a0c799589a7d6d9be19ccd24c541c48497be5d05c43b0e0f113582c408c64"} Dec 05 06:29:24 crc kubenswrapper[4865]: I1205 06:29:24.552398 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" event={"ID":"d6e75882-16f5-4c56-90a8-43d35503e87d","Type":"ContainerStarted","Data":"361d3b389768feda8327e7b8da90b3e554abc3ef17a443540fbf2dbab42fe6d2"} Dec 05 06:29:24 crc kubenswrapper[4865]: I1205 06:29:24.581182 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" podStartSLOduration=2.097756262 podStartE2EDuration="2.581160999s" podCreationTimestamp="2025-12-05 06:29:22 +0000 UTC" firstStartedPulling="2025-12-05 06:29:23.622802195 +0000 UTC m=+2182.902813417" lastFinishedPulling="2025-12-05 06:29:24.106206942 +0000 UTC m=+2183.386218154" observedRunningTime="2025-12-05 06:29:24.577998599 +0000 UTC m=+2183.858009841" watchObservedRunningTime="2025-12-05 06:29:24.581160999 +0000 UTC m=+2183.861172241" Dec 05 06:29:41 crc kubenswrapper[4865]: I1205 06:29:41.049430 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:29:41 crc kubenswrapper[4865]: I1205 06:29:41.050226 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:30:00 crc kubenswrapper[4865]: I1205 06:30:00.172704 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs"] Dec 05 06:30:00 crc kubenswrapper[4865]: I1205 06:30:00.175613 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs" Dec 05 06:30:00 crc kubenswrapper[4865]: I1205 06:30:00.177581 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 06:30:00 crc kubenswrapper[4865]: I1205 06:30:00.178019 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 06:30:00 crc kubenswrapper[4865]: I1205 06:30:00.195103 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs"] Dec 05 06:30:00 crc kubenswrapper[4865]: I1205 06:30:00.289297 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03a87491-ec02-4ff3-be61-05e7e49f9637-config-volume\") pod \"collect-profiles-29415270-6n2zs\" (UID: \"03a87491-ec02-4ff3-be61-05e7e49f9637\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs" Dec 05 06:30:00 crc kubenswrapper[4865]: I1205 06:30:00.289766 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjh6n\" (UniqueName: \"kubernetes.io/projected/03a87491-ec02-4ff3-be61-05e7e49f9637-kube-api-access-bjh6n\") pod \"collect-profiles-29415270-6n2zs\" (UID: \"03a87491-ec02-4ff3-be61-05e7e49f9637\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs" Dec 05 06:30:00 crc kubenswrapper[4865]: I1205 06:30:00.289986 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03a87491-ec02-4ff3-be61-05e7e49f9637-secret-volume\") pod \"collect-profiles-29415270-6n2zs\" (UID: \"03a87491-ec02-4ff3-be61-05e7e49f9637\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs" Dec 05 06:30:00 crc kubenswrapper[4865]: I1205 06:30:00.392057 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjh6n\" (UniqueName: \"kubernetes.io/projected/03a87491-ec02-4ff3-be61-05e7e49f9637-kube-api-access-bjh6n\") pod \"collect-profiles-29415270-6n2zs\" (UID: \"03a87491-ec02-4ff3-be61-05e7e49f9637\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs" Dec 05 06:30:00 crc kubenswrapper[4865]: I1205 06:30:00.392438 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03a87491-ec02-4ff3-be61-05e7e49f9637-secret-volume\") pod \"collect-profiles-29415270-6n2zs\" (UID: \"03a87491-ec02-4ff3-be61-05e7e49f9637\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs" Dec 05 06:30:00 crc kubenswrapper[4865]: I1205 06:30:00.392484 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03a87491-ec02-4ff3-be61-05e7e49f9637-config-volume\") pod \"collect-profiles-29415270-6n2zs\" (UID: \"03a87491-ec02-4ff3-be61-05e7e49f9637\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs" Dec 05 06:30:00 crc kubenswrapper[4865]: I1205 06:30:00.393441 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03a87491-ec02-4ff3-be61-05e7e49f9637-config-volume\") pod 
\"collect-profiles-29415270-6n2zs\" (UID: \"03a87491-ec02-4ff3-be61-05e7e49f9637\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs" Dec 05 06:30:00 crc kubenswrapper[4865]: I1205 06:30:00.398645 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03a87491-ec02-4ff3-be61-05e7e49f9637-secret-volume\") pod \"collect-profiles-29415270-6n2zs\" (UID: \"03a87491-ec02-4ff3-be61-05e7e49f9637\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs" Dec 05 06:30:00 crc kubenswrapper[4865]: I1205 06:30:00.417550 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjh6n\" (UniqueName: \"kubernetes.io/projected/03a87491-ec02-4ff3-be61-05e7e49f9637-kube-api-access-bjh6n\") pod \"collect-profiles-29415270-6n2zs\" (UID: \"03a87491-ec02-4ff3-be61-05e7e49f9637\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs" Dec 05 06:30:00 crc kubenswrapper[4865]: I1205 06:30:00.507963 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs" Dec 05 06:30:00 crc kubenswrapper[4865]: I1205 06:30:00.964193 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs"] Dec 05 06:30:00 crc kubenswrapper[4865]: W1205 06:30:00.980399 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03a87491_ec02_4ff3_be61_05e7e49f9637.slice/crio-729ab56da12bc8c315e2d998a3a591fd4fdc6a71048cbb398811baf3716baf28 WatchSource:0}: Error finding container 729ab56da12bc8c315e2d998a3a591fd4fdc6a71048cbb398811baf3716baf28: Status 404 returned error can't find the container with id 729ab56da12bc8c315e2d998a3a591fd4fdc6a71048cbb398811baf3716baf28 Dec 05 06:30:01 crc kubenswrapper[4865]: I1205 06:30:01.923880 4865 generic.go:334] "Generic (PLEG): container finished" podID="03a87491-ec02-4ff3-be61-05e7e49f9637" containerID="30ea87cf3a1797e349345f4fa69f145aaf13577a9e108ca45543f4c4944b90f0" exitCode=0 Dec 05 06:30:01 crc kubenswrapper[4865]: I1205 06:30:01.923979 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs" event={"ID":"03a87491-ec02-4ff3-be61-05e7e49f9637","Type":"ContainerDied","Data":"30ea87cf3a1797e349345f4fa69f145aaf13577a9e108ca45543f4c4944b90f0"} Dec 05 06:30:01 crc kubenswrapper[4865]: I1205 06:30:01.924289 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs" event={"ID":"03a87491-ec02-4ff3-be61-05e7e49f9637","Type":"ContainerStarted","Data":"729ab56da12bc8c315e2d998a3a591fd4fdc6a71048cbb398811baf3716baf28"} Dec 05 06:30:03 crc kubenswrapper[4865]: I1205 06:30:03.256472 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs" Dec 05 06:30:03 crc kubenswrapper[4865]: I1205 06:30:03.347990 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03a87491-ec02-4ff3-be61-05e7e49f9637-secret-volume\") pod \"03a87491-ec02-4ff3-be61-05e7e49f9637\" (UID: \"03a87491-ec02-4ff3-be61-05e7e49f9637\") " Dec 05 06:30:03 crc kubenswrapper[4865]: I1205 06:30:03.348264 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjh6n\" (UniqueName: \"kubernetes.io/projected/03a87491-ec02-4ff3-be61-05e7e49f9637-kube-api-access-bjh6n\") pod \"03a87491-ec02-4ff3-be61-05e7e49f9637\" (UID: \"03a87491-ec02-4ff3-be61-05e7e49f9637\") " Dec 05 06:30:03 crc kubenswrapper[4865]: I1205 06:30:03.348343 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03a87491-ec02-4ff3-be61-05e7e49f9637-config-volume\") pod \"03a87491-ec02-4ff3-be61-05e7e49f9637\" (UID: \"03a87491-ec02-4ff3-be61-05e7e49f9637\") " Dec 05 06:30:03 crc kubenswrapper[4865]: I1205 06:30:03.349574 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03a87491-ec02-4ff3-be61-05e7e49f9637-config-volume" (OuterVolumeSpecName: "config-volume") pod "03a87491-ec02-4ff3-be61-05e7e49f9637" (UID: "03a87491-ec02-4ff3-be61-05e7e49f9637"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:30:03 crc kubenswrapper[4865]: I1205 06:30:03.358956 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03a87491-ec02-4ff3-be61-05e7e49f9637-kube-api-access-bjh6n" (OuterVolumeSpecName: "kube-api-access-bjh6n") pod "03a87491-ec02-4ff3-be61-05e7e49f9637" (UID: "03a87491-ec02-4ff3-be61-05e7e49f9637"). InnerVolumeSpecName "kube-api-access-bjh6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:30:03 crc kubenswrapper[4865]: I1205 06:30:03.360614 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03a87491-ec02-4ff3-be61-05e7e49f9637-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "03a87491-ec02-4ff3-be61-05e7e49f9637" (UID: "03a87491-ec02-4ff3-be61-05e7e49f9637"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:30:03 crc kubenswrapper[4865]: I1205 06:30:03.451171 4865 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03a87491-ec02-4ff3-be61-05e7e49f9637-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 06:30:03 crc kubenswrapper[4865]: I1205 06:30:03.451534 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjh6n\" (UniqueName: \"kubernetes.io/projected/03a87491-ec02-4ff3-be61-05e7e49f9637-kube-api-access-bjh6n\") on node \"crc\" DevicePath \"\"" Dec 05 06:30:03 crc kubenswrapper[4865]: I1205 06:30:03.451548 4865 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03a87491-ec02-4ff3-be61-05e7e49f9637-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 06:30:03 crc kubenswrapper[4865]: I1205 06:30:03.948448 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs" event={"ID":"03a87491-ec02-4ff3-be61-05e7e49f9637","Type":"ContainerDied","Data":"729ab56da12bc8c315e2d998a3a591fd4fdc6a71048cbb398811baf3716baf28"} Dec 05 06:30:03 crc kubenswrapper[4865]: I1205 06:30:03.948493 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="729ab56da12bc8c315e2d998a3a591fd4fdc6a71048cbb398811baf3716baf28" Dec 05 06:30:03 crc kubenswrapper[4865]: I1205 06:30:03.948559 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs" Dec 05 06:30:04 crc kubenswrapper[4865]: I1205 06:30:04.348084 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb"] Dec 05 06:30:04 crc kubenswrapper[4865]: I1205 06:30:04.355326 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415225-7pnfb"] Dec 05 06:30:05 crc kubenswrapper[4865]: I1205 06:30:05.024093 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d60e1629-83b6-4492-bd6c-c0ed90da02be" path="/var/lib/kubelet/pods/d60e1629-83b6-4492-bd6c-c0ed90da02be/volumes" Dec 05 06:30:11 crc kubenswrapper[4865]: I1205 06:30:11.049433 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:30:11 crc kubenswrapper[4865]: I1205 06:30:11.049941 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:30:11 crc kubenswrapper[4865]: I1205 06:30:11.049983 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 06:30:11 crc kubenswrapper[4865]: I1205 06:30:11.050666 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a"} 
pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 06:30:11 crc kubenswrapper[4865]: I1205 06:30:11.050708 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" containerID="cri-o://0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" gracePeriod=600 Dec 05 06:30:11 crc kubenswrapper[4865]: E1205 06:30:11.177557 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:30:12 crc kubenswrapper[4865]: I1205 06:30:12.026743 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" exitCode=0 Dec 05 06:30:12 crc kubenswrapper[4865]: I1205 06:30:12.026799 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a"} Dec 05 06:30:12 crc kubenswrapper[4865]: I1205 06:30:12.026863 4865 scope.go:117] "RemoveContainer" containerID="8298b19a58cec01e49fca2a020c44af4ff6818830b5baef2bd67270b8780a994" Dec 05 06:30:12 crc kubenswrapper[4865]: I1205 06:30:12.027654 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:30:12 crc kubenswrapper[4865]: E1205 06:30:12.028256 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:30:27 crc kubenswrapper[4865]: I1205 06:30:27.007415 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:30:27 crc kubenswrapper[4865]: E1205 06:30:27.008640 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:30:38 crc kubenswrapper[4865]: I1205 06:30:38.006672 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:30:38 crc kubenswrapper[4865]: E1205 06:30:38.007282 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:30:38 crc kubenswrapper[4865]: I1205 06:30:38.995139 4865 scope.go:117] "RemoveContainer" containerID="600c03f669aea52a78fb1980f4670028153595dc18e964c5d30397e0e471a32e" Dec 05 06:30:45 crc kubenswrapper[4865]: I1205 06:30:45.374027 4865 generic.go:334] "Generic (PLEG): container finished" podID="d6e75882-16f5-4c56-90a8-43d35503e87d" containerID="be0a0c799589a7d6d9be19ccd24c541c48497be5d05c43b0e0f113582c408c64" exitCode=0 Dec 05 06:30:45 crc kubenswrapper[4865]: I1205 06:30:45.374208 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" event={"ID":"d6e75882-16f5-4c56-90a8-43d35503e87d","Type":"ContainerDied","Data":"be0a0c799589a7d6d9be19ccd24c541c48497be5d05c43b0e0f113582c408c64"} Dec 05 06:30:46 crc kubenswrapper[4865]: I1205 06:30:46.918869 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.037337 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d6e75882-16f5-4c56-90a8-43d35503e87d-ovncontroller-config-0\") pod \"d6e75882-16f5-4c56-90a8-43d35503e87d\" (UID: \"d6e75882-16f5-4c56-90a8-43d35503e87d\") " Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.037431 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6e75882-16f5-4c56-90a8-43d35503e87d-ssh-key\") pod \"d6e75882-16f5-4c56-90a8-43d35503e87d\" (UID: \"d6e75882-16f5-4c56-90a8-43d35503e87d\") " Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.037469 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6e75882-16f5-4c56-90a8-43d35503e87d-inventory\") pod \"d6e75882-16f5-4c56-90a8-43d35503e87d\" (UID: \"d6e75882-16f5-4c56-90a8-43d35503e87d\") " Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.037542 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c64b4\" (UniqueName: \"kubernetes.io/projected/d6e75882-16f5-4c56-90a8-43d35503e87d-kube-api-access-c64b4\") pod \"d6e75882-16f5-4c56-90a8-43d35503e87d\" (UID: \"d6e75882-16f5-4c56-90a8-43d35503e87d\") " Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.037557 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e75882-16f5-4c56-90a8-43d35503e87d-ovn-combined-ca-bundle\") pod \"d6e75882-16f5-4c56-90a8-43d35503e87d\" (UID: \"d6e75882-16f5-4c56-90a8-43d35503e87d\") " Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.042933 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e75882-16f5-4c56-90a8-43d35503e87d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d6e75882-16f5-4c56-90a8-43d35503e87d" (UID: "d6e75882-16f5-4c56-90a8-43d35503e87d"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.044638 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e75882-16f5-4c56-90a8-43d35503e87d-kube-api-access-c64b4" (OuterVolumeSpecName: "kube-api-access-c64b4") pod "d6e75882-16f5-4c56-90a8-43d35503e87d" (UID: "d6e75882-16f5-4c56-90a8-43d35503e87d"). InnerVolumeSpecName "kube-api-access-c64b4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.072518 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e75882-16f5-4c56-90a8-43d35503e87d-inventory" (OuterVolumeSpecName: "inventory") pod "d6e75882-16f5-4c56-90a8-43d35503e87d" (UID: "d6e75882-16f5-4c56-90a8-43d35503e87d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.083199 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e75882-16f5-4c56-90a8-43d35503e87d-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "d6e75882-16f5-4c56-90a8-43d35503e87d" (UID: "d6e75882-16f5-4c56-90a8-43d35503e87d"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.088157 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e75882-16f5-4c56-90a8-43d35503e87d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d6e75882-16f5-4c56-90a8-43d35503e87d" (UID: "d6e75882-16f5-4c56-90a8-43d35503e87d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.139610 4865 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d6e75882-16f5-4c56-90a8-43d35503e87d-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.139802 4865 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6e75882-16f5-4c56-90a8-43d35503e87d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.139914 4865 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6e75882-16f5-4c56-90a8-43d35503e87d-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.139993 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c64b4\" (UniqueName: \"kubernetes.io/projected/d6e75882-16f5-4c56-90a8-43d35503e87d-kube-api-access-c64b4\") on node \"crc\" DevicePath \"\"" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.140064 4865 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e75882-16f5-4c56-90a8-43d35503e87d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.394001 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" event={"ID":"d6e75882-16f5-4c56-90a8-43d35503e87d","Type":"ContainerDied","Data":"361d3b389768feda8327e7b8da90b3e554abc3ef17a443540fbf2dbab42fe6d2"} Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 
06:30:47.394424 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="361d3b389768feda8327e7b8da90b3e554abc3ef17a443540fbf2dbab42fe6d2" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.394249 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xr5z8" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.495687 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb"] Dec 05 06:30:47 crc kubenswrapper[4865]: E1205 06:30:47.496563 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e75882-16f5-4c56-90a8-43d35503e87d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.496633 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e75882-16f5-4c56-90a8-43d35503e87d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 06:30:47 crc kubenswrapper[4865]: E1205 06:30:47.496708 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03a87491-ec02-4ff3-be61-05e7e49f9637" containerName="collect-profiles" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.496766 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="03a87491-ec02-4ff3-be61-05e7e49f9637" containerName="collect-profiles" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.497027 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="03a87491-ec02-4ff3-be61-05e7e49f9637" containerName="collect-profiles" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.497087 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e75882-16f5-4c56-90a8-43d35503e87d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.497690 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.500293 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.500420 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.500491 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gtc4b" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.500293 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.500953 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.501417 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.536831 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb"] Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.645814 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.646169 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.646332 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.646419 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.646522 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsjfh\" (UniqueName: 
\"kubernetes.io/projected/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-kube-api-access-zsjfh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.646598 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.748168 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.748218 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.748269 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsjfh\" (UniqueName: \"kubernetes.io/projected/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-kube-api-access-zsjfh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.748286 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.748346 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.748370 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.752908 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.752888 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.755179 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.756679 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.763099 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.771032 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsjfh\" (UniqueName: \"kubernetes.io/projected/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-kube-api-access-zsjfh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:47 crc kubenswrapper[4865]: I1205 06:30:47.829073 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:30:48 crc kubenswrapper[4865]: I1205 06:30:48.350149 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb"] Dec 05 06:30:48 crc kubenswrapper[4865]: W1205 06:30:48.351736 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7073b1ac_84a6_4dc4_9ccb_4e8b711a34e9.slice/crio-c3348069a2aad4315f34114848ab398314614c2b8af1a20451ff3c853bb704cb WatchSource:0}: Error finding container c3348069a2aad4315f34114848ab398314614c2b8af1a20451ff3c853bb704cb: Status 404 returned error can't find the container with id c3348069a2aad4315f34114848ab398314614c2b8af1a20451ff3c853bb704cb Dec 05 06:30:48 crc kubenswrapper[4865]: I1205 06:30:48.403446 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" event={"ID":"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9","Type":"ContainerStarted","Data":"c3348069a2aad4315f34114848ab398314614c2b8af1a20451ff3c853bb704cb"} Dec 05 06:30:49 crc kubenswrapper[4865]: I1205 06:30:49.007295 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:30:49 crc kubenswrapper[4865]: E1205 06:30:49.008073 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:30:50 crc kubenswrapper[4865]: I1205 06:30:50.428964 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" event={"ID":"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9","Type":"ContainerStarted","Data":"2ca1947be7789f2f9a5f124ed2940d9eb2d299b79748cd26d1e39be2da2c974f"} Dec 05 06:30:50 crc kubenswrapper[4865]: I1205 06:30:50.455521 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" podStartSLOduration=2.588002549 podStartE2EDuration="3.455498613s" podCreationTimestamp="2025-12-05 06:30:47 +0000 UTC" firstStartedPulling="2025-12-05 06:30:48.353703801 +0000 UTC m=+2267.633715023" lastFinishedPulling="2025-12-05 06:30:49.221199825 +0000 UTC m=+2268.501211087" observedRunningTime="2025-12-05 06:30:50.447990329 +0000 UTC m=+2269.728001561" watchObservedRunningTime="2025-12-05 06:30:50.455498613 +0000 UTC m=+2269.735509835" Dec 05 06:31:00 crc kubenswrapper[4865]: I1205 06:31:00.006778 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:31:00 crc kubenswrapper[4865]: E1205 06:31:00.007527 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:31:12 crc 
kubenswrapper[4865]: I1205 06:31:12.006896 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:31:12 crc kubenswrapper[4865]: E1205 06:31:12.007646 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:31:23 crc kubenswrapper[4865]: I1205 06:31:23.006291 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:31:23 crc kubenswrapper[4865]: E1205 06:31:23.006952 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:31:35 crc kubenswrapper[4865]: I1205 06:31:35.007121 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:31:35 crc kubenswrapper[4865]: E1205 06:31:35.007711 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:31:46 crc kubenswrapper[4865]: I1205 06:31:46.006808 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:31:46 crc kubenswrapper[4865]: E1205 06:31:46.007866 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:31:46 crc kubenswrapper[4865]: I1205 06:31:46.915259 4865 generic.go:334] "Generic (PLEG): container finished" podID="7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9" containerID="2ca1947be7789f2f9a5f124ed2940d9eb2d299b79748cd26d1e39be2da2c974f" exitCode=0 Dec 05 06:31:46 crc kubenswrapper[4865]: I1205 06:31:46.915314 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" event={"ID":"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9","Type":"ContainerDied","Data":"2ca1947be7789f2f9a5f124ed2940d9eb2d299b79748cd26d1e39be2da2c974f"} Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.346460 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.403134 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsjfh\" (UniqueName: \"kubernetes.io/projected/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-kube-api-access-zsjfh\") pod \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.403398 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-neutron-metadata-combined-ca-bundle\") pod \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.403449 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.403481 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-ssh-key\") pod \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.403525 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-nova-metadata-neutron-config-0\") pod \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.403554 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-inventory\") pod \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\" (UID: \"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9\") " Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.412055 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9" (UID: "7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.413938 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-kube-api-access-zsjfh" (OuterVolumeSpecName: "kube-api-access-zsjfh") pod "7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9" (UID: "7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9"). InnerVolumeSpecName "kube-api-access-zsjfh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.430766 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-inventory" (OuterVolumeSpecName: "inventory") pod "7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9" (UID: "7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.435798 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9" (UID: "7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.450093 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9" (UID: "7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.462100 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9" (UID: "7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.507037 4865 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.507632 4865 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.507742 4865 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.507879 4865 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.507993 4865 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.508097 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsjfh\" (UniqueName: \"kubernetes.io/projected/7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9-kube-api-access-zsjfh\") on node \"crc\" DevicePath \"\"" Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.931281 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" event={"ID":"7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9","Type":"ContainerDied","Data":"c3348069a2aad4315f34114848ab398314614c2b8af1a20451ff3c853bb704cb"} Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.931325 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3348069a2aad4315f34114848ab398314614c2b8af1a20451ff3c853bb704cb" Dec 05 06:31:48 crc kubenswrapper[4865]: I1205 06:31:48.931330 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.047134 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt"] Dec 05 06:31:49 crc kubenswrapper[4865]: E1205 06:31:49.047606 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.047628 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.047853 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.048581 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.052122 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gtc4b" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.053434 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.053507 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.053555 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.054044 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.060417 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt"] Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.118505 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rvldt\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.118652 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww7pz\" (UniqueName: \"kubernetes.io/projected/ef2fa284-2648-4c53-8443-e60705efb609-kube-api-access-ww7pz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rvldt\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.118731 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rvldt\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.118769 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rvldt\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.118811 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rvldt\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.220204 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rvldt\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.220261 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rvldt\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.220349 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww7pz\" (UniqueName: \"kubernetes.io/projected/ef2fa284-2648-4c53-8443-e60705efb609-kube-api-access-ww7pz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rvldt\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.220414 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rvldt\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.220445 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rvldt\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.224887 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rvldt\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.226562 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rvldt\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.226571 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rvldt\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.226607 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-ssh-key\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-rvldt\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.238935 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww7pz\" (UniqueName: \"kubernetes.io/projected/ef2fa284-2648-4c53-8443-e60705efb609-kube-api-access-ww7pz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rvldt\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" Dec 05 06:31:49 crc kubenswrapper[4865]: I1205 06:31:49.366248 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" Dec 05 06:31:50 crc kubenswrapper[4865]: I1205 06:31:50.028650 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt"] Dec 05 06:31:50 crc kubenswrapper[4865]: W1205 06:31:50.029530 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef2fa284_2648_4c53_8443_e60705efb609.slice/crio-c17763a583a3e97f4bd3e8adb3ec1c6f95f4c71a5d9fed45b170d39f8f7dbe22 WatchSource:0}: Error finding container c17763a583a3e97f4bd3e8adb3ec1c6f95f4c71a5d9fed45b170d39f8f7dbe22: Status 404 returned error can't find the container with id c17763a583a3e97f4bd3e8adb3ec1c6f95f4c71a5d9fed45b170d39f8f7dbe22 Dec 05 06:31:50 crc kubenswrapper[4865]: I1205 06:31:50.977228 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" event={"ID":"ef2fa284-2648-4c53-8443-e60705efb609","Type":"ContainerStarted","Data":"2dc15a6353727b9acf40a8e837ec8a76dcc1ecce2184cd127eef71687bd594fe"} Dec 05 06:31:50 crc kubenswrapper[4865]: I1205 06:31:50.977623 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" event={"ID":"ef2fa284-2648-4c53-8443-e60705efb609","Type":"ContainerStarted","Data":"c17763a583a3e97f4bd3e8adb3ec1c6f95f4c71a5d9fed45b170d39f8f7dbe22"} Dec 05 06:31:51 crc kubenswrapper[4865]: I1205 06:31:51.000112 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" podStartSLOduration=1.503376574 podStartE2EDuration="2.00008878s" podCreationTimestamp="2025-12-05 06:31:49 +0000 UTC" firstStartedPulling="2025-12-05 06:31:50.034074468 +0000 UTC m=+2329.314085680" lastFinishedPulling="2025-12-05 06:31:50.530786624 +0000 UTC m=+2329.810797886" observedRunningTime="2025-12-05 06:31:50.994220802 +0000 UTC m=+2330.274232064" watchObservedRunningTime="2025-12-05 06:31:51.00008878 +0000 UTC m=+2330.280100042" Dec 05 06:31:59 crc kubenswrapper[4865]: I1205 06:31:59.007400 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:31:59 crc kubenswrapper[4865]: E1205 06:31:59.008245 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:32:11 crc 
kubenswrapper[4865]: I1205 06:32:11.011921 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:32:11 crc kubenswrapper[4865]: E1205 06:32:11.012640 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:32:25 crc kubenswrapper[4865]: I1205 06:32:25.006950 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:32:25 crc kubenswrapper[4865]: E1205 06:32:25.007645 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:32:38 crc kubenswrapper[4865]: I1205 06:32:38.008032 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:32:38 crc kubenswrapper[4865]: E1205 06:32:38.009270 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:32:49 crc kubenswrapper[4865]: I1205 06:32:49.007864 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:32:49 crc kubenswrapper[4865]: E1205 06:32:49.009141 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:33:01 crc kubenswrapper[4865]: I1205 06:33:01.011591 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:33:01 crc kubenswrapper[4865]: E1205 06:33:01.012534 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:33:15 crc kubenswrapper[4865]: I1205 06:33:15.007171 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:33:15 crc 
kubenswrapper[4865]: E1205 06:33:15.009979 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:33:27 crc kubenswrapper[4865]: I1205 06:33:27.007768 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:33:27 crc kubenswrapper[4865]: E1205 06:33:27.009280 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:33:39 crc kubenswrapper[4865]: I1205 06:33:39.006933 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:33:39 crc kubenswrapper[4865]: E1205 06:33:39.007759 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:33:53 crc kubenswrapper[4865]: I1205 06:33:53.007375 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:33:53 crc kubenswrapper[4865]: E1205 06:33:53.009024 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:34:05 crc kubenswrapper[4865]: I1205 06:34:05.007604 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:34:05 crc kubenswrapper[4865]: E1205 06:34:05.008759 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:34:17 crc kubenswrapper[4865]: I1205 06:34:17.007478 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:34:17 crc kubenswrapper[4865]: E1205 06:34:17.008897 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:34:30 crc kubenswrapper[4865]: I1205 06:34:30.007556 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:34:30 crc kubenswrapper[4865]: E1205 06:34:30.008428 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:34:45 crc kubenswrapper[4865]: I1205 06:34:45.006452 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:34:45 crc kubenswrapper[4865]: E1205 06:34:45.007235 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:34:59 crc kubenswrapper[4865]: I1205 06:34:59.007129 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:34:59 crc kubenswrapper[4865]: E1205 06:34:59.008981 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:35:12 crc kubenswrapper[4865]: I1205 06:35:12.006944 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:35:13 crc kubenswrapper[4865]: I1205 06:35:13.168366 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"c8d511abdaaac858feab2a7a74288283921974991421c6d6ac314782261ff80d"} Dec 05 06:36:56 crc kubenswrapper[4865]: I1205 06:36:56.194639 4865 generic.go:334] "Generic (PLEG): container finished" podID="ef2fa284-2648-4c53-8443-e60705efb609" containerID="2dc15a6353727b9acf40a8e837ec8a76dcc1ecce2184cd127eef71687bd594fe" exitCode=0 Dec 05 06:36:56 crc kubenswrapper[4865]: I1205 06:36:56.194729 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" event={"ID":"ef2fa284-2648-4c53-8443-e60705efb609","Type":"ContainerDied","Data":"2dc15a6353727b9acf40a8e837ec8a76dcc1ecce2184cd127eef71687bd594fe"} Dec 05 06:36:57 crc kubenswrapper[4865]: I1205 06:36:57.682520 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" Dec 05 06:36:57 crc kubenswrapper[4865]: I1205 06:36:57.766885 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww7pz\" (UniqueName: \"kubernetes.io/projected/ef2fa284-2648-4c53-8443-e60705efb609-kube-api-access-ww7pz\") pod \"ef2fa284-2648-4c53-8443-e60705efb609\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " Dec 05 06:36:57 crc kubenswrapper[4865]: I1205 06:36:57.767002 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-inventory\") pod \"ef2fa284-2648-4c53-8443-e60705efb609\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " Dec 05 06:36:57 crc kubenswrapper[4865]: I1205 06:36:57.767257 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-ssh-key\") pod \"ef2fa284-2648-4c53-8443-e60705efb609\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " Dec 05 06:36:57 crc kubenswrapper[4865]: I1205 06:36:57.767299 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-libvirt-combined-ca-bundle\") pod \"ef2fa284-2648-4c53-8443-e60705efb609\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " Dec 05 06:36:57 crc kubenswrapper[4865]: I1205 06:36:57.767319 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-libvirt-secret-0\") pod \"ef2fa284-2648-4c53-8443-e60705efb609\" (UID: \"ef2fa284-2648-4c53-8443-e60705efb609\") " Dec 05 06:36:57 crc kubenswrapper[4865]: I1205 06:36:57.783496 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "ef2fa284-2648-4c53-8443-e60705efb609" (UID: "ef2fa284-2648-4c53-8443-e60705efb609"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:36:57 crc kubenswrapper[4865]: I1205 06:36:57.789722 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef2fa284-2648-4c53-8443-e60705efb609-kube-api-access-ww7pz" (OuterVolumeSpecName: "kube-api-access-ww7pz") pod "ef2fa284-2648-4c53-8443-e60705efb609" (UID: "ef2fa284-2648-4c53-8443-e60705efb609"). InnerVolumeSpecName "kube-api-access-ww7pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:36:57 crc kubenswrapper[4865]: I1205 06:36:57.793920 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-inventory" (OuterVolumeSpecName: "inventory") pod "ef2fa284-2648-4c53-8443-e60705efb609" (UID: "ef2fa284-2648-4c53-8443-e60705efb609"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:36:57 crc kubenswrapper[4865]: I1205 06:36:57.806238 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "ef2fa284-2648-4c53-8443-e60705efb609" (UID: "ef2fa284-2648-4c53-8443-e60705efb609"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:36:57 crc kubenswrapper[4865]: I1205 06:36:57.807725 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ef2fa284-2648-4c53-8443-e60705efb609" (UID: "ef2fa284-2648-4c53-8443-e60705efb609"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:36:57 crc kubenswrapper[4865]: I1205 06:36:57.869853 4865 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:36:57 crc kubenswrapper[4865]: I1205 06:36:57.870090 4865 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:36:57 crc kubenswrapper[4865]: I1205 06:36:57.870103 4865 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:36:57 crc kubenswrapper[4865]: I1205 06:36:57.870140 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww7pz\" (UniqueName: \"kubernetes.io/projected/ef2fa284-2648-4c53-8443-e60705efb609-kube-api-access-ww7pz\") on node \"crc\" DevicePath \"\"" Dec 05 06:36:57 crc kubenswrapper[4865]: I1205 06:36:57.870149 4865 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef2fa284-2648-4c53-8443-e60705efb609-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.216923 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" event={"ID":"ef2fa284-2648-4c53-8443-e60705efb609","Type":"ContainerDied","Data":"c17763a583a3e97f4bd3e8adb3ec1c6f95f4c71a5d9fed45b170d39f8f7dbe22"} Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.216972 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c17763a583a3e97f4bd3e8adb3ec1c6f95f4c71a5d9fed45b170d39f8f7dbe22" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.216975 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rvldt" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.333796 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8"] Dec 05 06:36:58 crc kubenswrapper[4865]: E1205 06:36:58.334485 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef2fa284-2648-4c53-8443-e60705efb609" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.334516 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef2fa284-2648-4c53-8443-e60705efb609" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.334875 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef2fa284-2648-4c53-8443-e60705efb609" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.335815 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.342649 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.344291 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.344500 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.344600 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.344855 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gtc4b" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.345055 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.345227 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.358783 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8"] Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.485749 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.485836 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.485875 4865 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.486020 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.486113 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.486148 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s9px\" (UniqueName: \"kubernetes.io/projected/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-kube-api-access-7s9px\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.486177 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.486233 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.486270 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.588371 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 
06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.588440 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.588502 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.588558 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.588705 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.588755 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s9px\" (UniqueName: \"kubernetes.io/projected/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-kube-api-access-7s9px\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.588797 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.588881 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.588946 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 
06:36:58.590678 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.595261 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.595345 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.597644 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.598672 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.604313 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.604624 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.606981 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s9px\" (UniqueName: \"kubernetes.io/projected/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-kube-api-access-7s9px\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.611243 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8pmn8\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:58 crc kubenswrapper[4865]: I1205 06:36:58.651946 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:36:59 crc kubenswrapper[4865]: I1205 06:36:59.213461 4865 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 06:36:59 crc kubenswrapper[4865]: I1205 06:36:59.221464 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8"] Dec 05 06:36:59 crc kubenswrapper[4865]: I1205 06:36:59.228764 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" event={"ID":"4b81cc6f-f002-4a0d-911f-2aedbec17e6c","Type":"ContainerStarted","Data":"2a0070cd03a244e7ddaa12c91a67c8ab4e62b6dd803d1b78115451f1a8c22ccc"} Dec 05 06:37:00 crc kubenswrapper[4865]: I1205 06:37:00.251084 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" event={"ID":"4b81cc6f-f002-4a0d-911f-2aedbec17e6c","Type":"ContainerStarted","Data":"9d0ab3bc12121698e39b63c74f8baf5fda933c101046f7d55f72e5bf82594403"} Dec 05 06:37:00 crc kubenswrapper[4865]: I1205 06:37:00.275165 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" podStartSLOduration=1.856855109 podStartE2EDuration="2.275140772s" podCreationTimestamp="2025-12-05 06:36:58 +0000 UTC" firstStartedPulling="2025-12-05 06:36:59.213245628 +0000 UTC m=+2638.493256850" lastFinishedPulling="2025-12-05 06:36:59.631531291 +0000 UTC m=+2638.911542513" observedRunningTime="2025-12-05 06:37:00.272317672 +0000 UTC m=+2639.552328934" watchObservedRunningTime="2025-12-05 06:37:00.275140772 +0000 UTC m=+2639.555152004" Dec 05 06:37:41 crc kubenswrapper[4865]: I1205 06:37:41.049039 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:37:41 crc kubenswrapper[4865]: I1205 06:37:41.049879 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:37:43 crc kubenswrapper[4865]: I1205 06:37:43.386741 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sf7hg"] Dec 05 06:37:43 crc kubenswrapper[4865]: I1205 06:37:43.389927 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sf7hg" Dec 05 06:37:43 crc kubenswrapper[4865]: I1205 06:37:43.410508 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sf7hg"] Dec 05 06:37:43 crc kubenswrapper[4865]: I1205 06:37:43.489988 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4e13ca9-98fc-4be4-8d16-04686701c822-catalog-content\") pod \"certified-operators-sf7hg\" (UID: \"f4e13ca9-98fc-4be4-8d16-04686701c822\") " pod="openshift-marketplace/certified-operators-sf7hg" Dec 05 06:37:43 crc kubenswrapper[4865]: I1205 06:37:43.490060 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4e13ca9-98fc-4be4-8d16-04686701c822-utilities\") pod \"certified-operators-sf7hg\" (UID: \"f4e13ca9-98fc-4be4-8d16-04686701c822\") " pod="openshift-marketplace/certified-operators-sf7hg" Dec 05 06:37:43 crc kubenswrapper[4865]: I1205 06:37:43.490125 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmjxf\" (UniqueName: \"kubernetes.io/projected/f4e13ca9-98fc-4be4-8d16-04686701c822-kube-api-access-bmjxf\") pod \"certified-operators-sf7hg\" (UID: \"f4e13ca9-98fc-4be4-8d16-04686701c822\") " pod="openshift-marketplace/certified-operators-sf7hg" Dec 05 06:37:43 crc kubenswrapper[4865]: I1205 06:37:43.592336 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4e13ca9-98fc-4be4-8d16-04686701c822-catalog-content\") pod \"certified-operators-sf7hg\" (UID: \"f4e13ca9-98fc-4be4-8d16-04686701c822\") " pod="openshift-marketplace/certified-operators-sf7hg" Dec 05 06:37:43 crc kubenswrapper[4865]: I1205 06:37:43.592730 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4e13ca9-98fc-4be4-8d16-04686701c822-utilities\") pod \"certified-operators-sf7hg\" (UID: \"f4e13ca9-98fc-4be4-8d16-04686701c822\") " pod="openshift-marketplace/certified-operators-sf7hg" Dec 05 06:37:43 crc kubenswrapper[4865]: I1205 06:37:43.592896 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmjxf\" (UniqueName: \"kubernetes.io/projected/f4e13ca9-98fc-4be4-8d16-04686701c822-kube-api-access-bmjxf\") pod \"certified-operators-sf7hg\" (UID: \"f4e13ca9-98fc-4be4-8d16-04686701c822\") " pod="openshift-marketplace/certified-operators-sf7hg" Dec 05 06:37:43 crc kubenswrapper[4865]: I1205 06:37:43.592951 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4e13ca9-98fc-4be4-8d16-04686701c822-catalog-content\") pod \"certified-operators-sf7hg\" (UID: \"f4e13ca9-98fc-4be4-8d16-04686701c822\") " pod="openshift-marketplace/certified-operators-sf7hg" Dec 05 06:37:43 crc kubenswrapper[4865]: I1205 06:37:43.593242 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4e13ca9-98fc-4be4-8d16-04686701c822-utilities\") pod \"certified-operators-sf7hg\" (UID: \"f4e13ca9-98fc-4be4-8d16-04686701c822\") " pod="openshift-marketplace/certified-operators-sf7hg" Dec 05 06:37:43 crc kubenswrapper[4865]: I1205 06:37:43.612704 4865 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bmjxf\" (UniqueName: \"kubernetes.io/projected/f4e13ca9-98fc-4be4-8d16-04686701c822-kube-api-access-bmjxf\") pod \"certified-operators-sf7hg\" (UID: \"f4e13ca9-98fc-4be4-8d16-04686701c822\") " pod="openshift-marketplace/certified-operators-sf7hg" Dec 05 06:37:43 crc kubenswrapper[4865]: I1205 06:37:43.760390 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sf7hg" Dec 05 06:37:44 crc kubenswrapper[4865]: I1205 06:37:44.367539 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sf7hg"] Dec 05 06:37:44 crc kubenswrapper[4865]: I1205 06:37:44.687809 4865 generic.go:334] "Generic (PLEG): container finished" podID="f4e13ca9-98fc-4be4-8d16-04686701c822" containerID="a7058f2fd588bc08fd965221959922afb93550a4a2c611f7400496be440e8fe2" exitCode=0 Dec 05 06:37:44 crc kubenswrapper[4865]: I1205 06:37:44.687920 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sf7hg" event={"ID":"f4e13ca9-98fc-4be4-8d16-04686701c822","Type":"ContainerDied","Data":"a7058f2fd588bc08fd965221959922afb93550a4a2c611f7400496be440e8fe2"} Dec 05 06:37:44 crc kubenswrapper[4865]: I1205 06:37:44.688787 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sf7hg" event={"ID":"f4e13ca9-98fc-4be4-8d16-04686701c822","Type":"ContainerStarted","Data":"f081c33ea19cd44f3ac8ae142b2f74d0bba4985bd3c08ba5f90e6cc617ff36da"} Dec 05 06:37:45 crc kubenswrapper[4865]: I1205 06:37:45.698677 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sf7hg" event={"ID":"f4e13ca9-98fc-4be4-8d16-04686701c822","Type":"ContainerStarted","Data":"07dc26f0b36611d34965dad22ddca4190fdfaa328d460e74cdf7977d8ba121f8"} Dec 05 06:37:46 crc kubenswrapper[4865]: I1205 06:37:46.709624 4865 generic.go:334] "Generic (PLEG): container finished" podID="f4e13ca9-98fc-4be4-8d16-04686701c822" containerID="07dc26f0b36611d34965dad22ddca4190fdfaa328d460e74cdf7977d8ba121f8" exitCode=0 Dec 05 06:37:46 crc kubenswrapper[4865]: I1205 06:37:46.709937 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sf7hg" event={"ID":"f4e13ca9-98fc-4be4-8d16-04686701c822","Type":"ContainerDied","Data":"07dc26f0b36611d34965dad22ddca4190fdfaa328d460e74cdf7977d8ba121f8"} Dec 05 06:37:47 crc kubenswrapper[4865]: I1205 06:37:47.728180 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sf7hg" event={"ID":"f4e13ca9-98fc-4be4-8d16-04686701c822","Type":"ContainerStarted","Data":"c62f36024e17bdbcf7c0bf450bd6458db4a620de6a6d5cb2f021e68c05df9661"} Dec 05 06:37:47 crc kubenswrapper[4865]: I1205 06:37:47.752506 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sf7hg" podStartSLOduration=2.261518859 podStartE2EDuration="4.752488512s" podCreationTimestamp="2025-12-05 06:37:43 +0000 UTC" firstStartedPulling="2025-12-05 06:37:44.689981664 +0000 UTC m=+2683.969992886" lastFinishedPulling="2025-12-05 06:37:47.180951317 +0000 UTC m=+2686.460962539" observedRunningTime="2025-12-05 06:37:47.749694863 +0000 UTC m=+2687.029706095" watchObservedRunningTime="2025-12-05 06:37:47.752488512 +0000 UTC m=+2687.032499734" Dec 05 06:37:48 crc kubenswrapper[4865]: I1205 06:37:48.167095 4865 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-h6wqk"] Dec 05 06:37:48 crc kubenswrapper[4865]: I1205 06:37:48.169120 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h6wqk" Dec 05 06:37:48 crc kubenswrapper[4865]: I1205 06:37:48.180028 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h6wqk"] Dec 05 06:37:48 crc kubenswrapper[4865]: I1205 06:37:48.287664 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/441bd7c2-c526-4bcd-a29c-ce3e62a1918a-utilities\") pod \"redhat-operators-h6wqk\" (UID: \"441bd7c2-c526-4bcd-a29c-ce3e62a1918a\") " pod="openshift-marketplace/redhat-operators-h6wqk" Dec 05 06:37:48 crc kubenswrapper[4865]: I1205 06:37:48.287741 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b49mk\" (UniqueName: \"kubernetes.io/projected/441bd7c2-c526-4bcd-a29c-ce3e62a1918a-kube-api-access-b49mk\") pod \"redhat-operators-h6wqk\" (UID: \"441bd7c2-c526-4bcd-a29c-ce3e62a1918a\") " pod="openshift-marketplace/redhat-operators-h6wqk" Dec 05 06:37:48 crc kubenswrapper[4865]: I1205 06:37:48.287865 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/441bd7c2-c526-4bcd-a29c-ce3e62a1918a-catalog-content\") pod \"redhat-operators-h6wqk\" (UID: \"441bd7c2-c526-4bcd-a29c-ce3e62a1918a\") " pod="openshift-marketplace/redhat-operators-h6wqk" Dec 05 06:37:48 crc kubenswrapper[4865]: I1205 06:37:48.389359 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/441bd7c2-c526-4bcd-a29c-ce3e62a1918a-catalog-content\") pod \"redhat-operators-h6wqk\" (UID: \"441bd7c2-c526-4bcd-a29c-ce3e62a1918a\") " pod="openshift-marketplace/redhat-operators-h6wqk" Dec 05 06:37:48 crc kubenswrapper[4865]: I1205 06:37:48.389781 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/441bd7c2-c526-4bcd-a29c-ce3e62a1918a-catalog-content\") pod \"redhat-operators-h6wqk\" (UID: \"441bd7c2-c526-4bcd-a29c-ce3e62a1918a\") " pod="openshift-marketplace/redhat-operators-h6wqk" Dec 05 06:37:48 crc kubenswrapper[4865]: I1205 06:37:48.390226 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/441bd7c2-c526-4bcd-a29c-ce3e62a1918a-utilities\") pod \"redhat-operators-h6wqk\" (UID: \"441bd7c2-c526-4bcd-a29c-ce3e62a1918a\") " pod="openshift-marketplace/redhat-operators-h6wqk" Dec 05 06:37:48 crc kubenswrapper[4865]: I1205 06:37:48.390311 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b49mk\" (UniqueName: \"kubernetes.io/projected/441bd7c2-c526-4bcd-a29c-ce3e62a1918a-kube-api-access-b49mk\") pod \"redhat-operators-h6wqk\" (UID: \"441bd7c2-c526-4bcd-a29c-ce3e62a1918a\") " pod="openshift-marketplace/redhat-operators-h6wqk" Dec 05 06:37:48 crc kubenswrapper[4865]: I1205 06:37:48.390487 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/441bd7c2-c526-4bcd-a29c-ce3e62a1918a-utilities\") pod \"redhat-operators-h6wqk\" (UID: \"441bd7c2-c526-4bcd-a29c-ce3e62a1918a\") " 
pod="openshift-marketplace/redhat-operators-h6wqk" Dec 05 06:37:48 crc kubenswrapper[4865]: I1205 06:37:48.419613 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49mk\" (UniqueName: \"kubernetes.io/projected/441bd7c2-c526-4bcd-a29c-ce3e62a1918a-kube-api-access-b49mk\") pod \"redhat-operators-h6wqk\" (UID: \"441bd7c2-c526-4bcd-a29c-ce3e62a1918a\") " pod="openshift-marketplace/redhat-operators-h6wqk" Dec 05 06:37:48 crc kubenswrapper[4865]: I1205 06:37:48.487498 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h6wqk" Dec 05 06:37:49 crc kubenswrapper[4865]: I1205 06:37:49.047892 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h6wqk"] Dec 05 06:37:49 crc kubenswrapper[4865]: W1205 06:37:49.064472 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod441bd7c2_c526_4bcd_a29c_ce3e62a1918a.slice/crio-2ece326ed53d4e6860bdb192e651869194685bc407b414c339b774b9805833b6 WatchSource:0}: Error finding container 2ece326ed53d4e6860bdb192e651869194685bc407b414c339b774b9805833b6: Status 404 returned error can't find the container with id 2ece326ed53d4e6860bdb192e651869194685bc407b414c339b774b9805833b6 Dec 05 06:37:49 crc kubenswrapper[4865]: I1205 06:37:49.744569 4865 generic.go:334] "Generic (PLEG): container finished" podID="441bd7c2-c526-4bcd-a29c-ce3e62a1918a" containerID="9534b32a7266a442e3627490d33dc54daa6bd3360cf00409f56debc3bb84550b" exitCode=0 Dec 05 06:37:49 crc kubenswrapper[4865]: I1205 06:37:49.744612 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6wqk" event={"ID":"441bd7c2-c526-4bcd-a29c-ce3e62a1918a","Type":"ContainerDied","Data":"9534b32a7266a442e3627490d33dc54daa6bd3360cf00409f56debc3bb84550b"} Dec 05 06:37:49 crc kubenswrapper[4865]: I1205 06:37:49.744942 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6wqk" event={"ID":"441bd7c2-c526-4bcd-a29c-ce3e62a1918a","Type":"ContainerStarted","Data":"2ece326ed53d4e6860bdb192e651869194685bc407b414c339b774b9805833b6"} Dec 05 06:37:50 crc kubenswrapper[4865]: I1205 06:37:50.757440 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6wqk" event={"ID":"441bd7c2-c526-4bcd-a29c-ce3e62a1918a","Type":"ContainerStarted","Data":"8344b187a20bf6ea84295ee624ffbe11622ea12f2d2f771a95f058862d8f3544"} Dec 05 06:37:53 crc kubenswrapper[4865]: I1205 06:37:53.760588 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sf7hg" Dec 05 06:37:53 crc kubenswrapper[4865]: I1205 06:37:53.761076 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sf7hg" Dec 05 06:37:53 crc kubenswrapper[4865]: I1205 06:37:53.791903 4865 generic.go:334] "Generic (PLEG): container finished" podID="441bd7c2-c526-4bcd-a29c-ce3e62a1918a" containerID="8344b187a20bf6ea84295ee624ffbe11622ea12f2d2f771a95f058862d8f3544" exitCode=0 Dec 05 06:37:53 crc kubenswrapper[4865]: I1205 06:37:53.791958 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6wqk" event={"ID":"441bd7c2-c526-4bcd-a29c-ce3e62a1918a","Type":"ContainerDied","Data":"8344b187a20bf6ea84295ee624ffbe11622ea12f2d2f771a95f058862d8f3544"} Dec 05 06:37:53 crc kubenswrapper[4865]: 
I1205 06:37:53.824513 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sf7hg" Dec 05 06:37:53 crc kubenswrapper[4865]: I1205 06:37:53.882624 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sf7hg" Dec 05 06:37:54 crc kubenswrapper[4865]: I1205 06:37:54.804258 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6wqk" event={"ID":"441bd7c2-c526-4bcd-a29c-ce3e62a1918a","Type":"ContainerStarted","Data":"9de125909156b70eb95a20a656a1c3edf06bb8f2ad4243b2a61a93544b651e3a"} Dec 05 06:37:54 crc kubenswrapper[4865]: I1205 06:37:54.840151 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h6wqk" podStartSLOduration=2.3468456140000002 podStartE2EDuration="6.840124498s" podCreationTimestamp="2025-12-05 06:37:48 +0000 UTC" firstStartedPulling="2025-12-05 06:37:49.746735834 +0000 UTC m=+2689.026747056" lastFinishedPulling="2025-12-05 06:37:54.240014698 +0000 UTC m=+2693.520025940" observedRunningTime="2025-12-05 06:37:54.831099941 +0000 UTC m=+2694.111111183" watchObservedRunningTime="2025-12-05 06:37:54.840124498 +0000 UTC m=+2694.120135720" Dec 05 06:37:54 crc kubenswrapper[4865]: I1205 06:37:54.960582 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sf7hg"] Dec 05 06:37:55 crc kubenswrapper[4865]: I1205 06:37:55.813195 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sf7hg" podUID="f4e13ca9-98fc-4be4-8d16-04686701c822" containerName="registry-server" containerID="cri-o://c62f36024e17bdbcf7c0bf450bd6458db4a620de6a6d5cb2f021e68c05df9661" gracePeriod=2 Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.477193 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sf7hg" Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.592607 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4e13ca9-98fc-4be4-8d16-04686701c822-catalog-content\") pod \"f4e13ca9-98fc-4be4-8d16-04686701c822\" (UID: \"f4e13ca9-98fc-4be4-8d16-04686701c822\") " Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.593300 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmjxf\" (UniqueName: \"kubernetes.io/projected/f4e13ca9-98fc-4be4-8d16-04686701c822-kube-api-access-bmjxf\") pod \"f4e13ca9-98fc-4be4-8d16-04686701c822\" (UID: \"f4e13ca9-98fc-4be4-8d16-04686701c822\") " Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.593547 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4e13ca9-98fc-4be4-8d16-04686701c822-utilities\") pod \"f4e13ca9-98fc-4be4-8d16-04686701c822\" (UID: \"f4e13ca9-98fc-4be4-8d16-04686701c822\") " Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.594506 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4e13ca9-98fc-4be4-8d16-04686701c822-utilities" (OuterVolumeSpecName: "utilities") pod "f4e13ca9-98fc-4be4-8d16-04686701c822" (UID: "f4e13ca9-98fc-4be4-8d16-04686701c822"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.607060 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4e13ca9-98fc-4be4-8d16-04686701c822-kube-api-access-bmjxf" (OuterVolumeSpecName: "kube-api-access-bmjxf") pod "f4e13ca9-98fc-4be4-8d16-04686701c822" (UID: "f4e13ca9-98fc-4be4-8d16-04686701c822"). InnerVolumeSpecName "kube-api-access-bmjxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.645275 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4e13ca9-98fc-4be4-8d16-04686701c822-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4e13ca9-98fc-4be4-8d16-04686701c822" (UID: "f4e13ca9-98fc-4be4-8d16-04686701c822"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.696071 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmjxf\" (UniqueName: \"kubernetes.io/projected/f4e13ca9-98fc-4be4-8d16-04686701c822-kube-api-access-bmjxf\") on node \"crc\" DevicePath \"\"" Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.696104 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4e13ca9-98fc-4be4-8d16-04686701c822-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.696114 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4e13ca9-98fc-4be4-8d16-04686701c822-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.829276 4865 generic.go:334] "Generic (PLEG): container finished" podID="f4e13ca9-98fc-4be4-8d16-04686701c822" containerID="c62f36024e17bdbcf7c0bf450bd6458db4a620de6a6d5cb2f021e68c05df9661" exitCode=0 Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.829332 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sf7hg" event={"ID":"f4e13ca9-98fc-4be4-8d16-04686701c822","Type":"ContainerDied","Data":"c62f36024e17bdbcf7c0bf450bd6458db4a620de6a6d5cb2f021e68c05df9661"} Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.829370 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sf7hg" event={"ID":"f4e13ca9-98fc-4be4-8d16-04686701c822","Type":"ContainerDied","Data":"f081c33ea19cd44f3ac8ae142b2f74d0bba4985bd3c08ba5f90e6cc617ff36da"} Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.829397 4865 scope.go:117] "RemoveContainer" containerID="c62f36024e17bdbcf7c0bf450bd6458db4a620de6a6d5cb2f021e68c05df9661" Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.829590 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sf7hg" Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.878815 4865 scope.go:117] "RemoveContainer" containerID="07dc26f0b36611d34965dad22ddca4190fdfaa328d460e74cdf7977d8ba121f8" Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.888932 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sf7hg"] Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.903807 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sf7hg"] Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.927917 4865 scope.go:117] "RemoveContainer" containerID="a7058f2fd588bc08fd965221959922afb93550a4a2c611f7400496be440e8fe2" Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.969852 4865 scope.go:117] "RemoveContainer" containerID="c62f36024e17bdbcf7c0bf450bd6458db4a620de6a6d5cb2f021e68c05df9661" Dec 05 06:37:56 crc kubenswrapper[4865]: E1205 06:37:56.971245 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c62f36024e17bdbcf7c0bf450bd6458db4a620de6a6d5cb2f021e68c05df9661\": container with ID starting with c62f36024e17bdbcf7c0bf450bd6458db4a620de6a6d5cb2f021e68c05df9661 not found: ID does not exist" containerID="c62f36024e17bdbcf7c0bf450bd6458db4a620de6a6d5cb2f021e68c05df9661" Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.971292 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c62f36024e17bdbcf7c0bf450bd6458db4a620de6a6d5cb2f021e68c05df9661"} err="failed to get container status \"c62f36024e17bdbcf7c0bf450bd6458db4a620de6a6d5cb2f021e68c05df9661\": rpc error: code = NotFound desc = could not find container \"c62f36024e17bdbcf7c0bf450bd6458db4a620de6a6d5cb2f021e68c05df9661\": container with ID starting with c62f36024e17bdbcf7c0bf450bd6458db4a620de6a6d5cb2f021e68c05df9661 not found: ID does not exist" Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.971320 4865 scope.go:117] "RemoveContainer" containerID="07dc26f0b36611d34965dad22ddca4190fdfaa328d460e74cdf7977d8ba121f8" Dec 05 06:37:56 crc kubenswrapper[4865]: E1205 06:37:56.979303 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07dc26f0b36611d34965dad22ddca4190fdfaa328d460e74cdf7977d8ba121f8\": container with ID starting with 07dc26f0b36611d34965dad22ddca4190fdfaa328d460e74cdf7977d8ba121f8 not found: ID does not exist" containerID="07dc26f0b36611d34965dad22ddca4190fdfaa328d460e74cdf7977d8ba121f8" Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.979368 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07dc26f0b36611d34965dad22ddca4190fdfaa328d460e74cdf7977d8ba121f8"} err="failed to get container status \"07dc26f0b36611d34965dad22ddca4190fdfaa328d460e74cdf7977d8ba121f8\": rpc error: code = NotFound desc = could not find container \"07dc26f0b36611d34965dad22ddca4190fdfaa328d460e74cdf7977d8ba121f8\": container with ID starting with 07dc26f0b36611d34965dad22ddca4190fdfaa328d460e74cdf7977d8ba121f8 not found: ID does not exist" Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.979407 4865 scope.go:117] "RemoveContainer" containerID="a7058f2fd588bc08fd965221959922afb93550a4a2c611f7400496be440e8fe2" Dec 05 06:37:56 crc kubenswrapper[4865]: E1205 06:37:56.979912 4865 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a7058f2fd588bc08fd965221959922afb93550a4a2c611f7400496be440e8fe2\": container with ID starting with a7058f2fd588bc08fd965221959922afb93550a4a2c611f7400496be440e8fe2 not found: ID does not exist" containerID="a7058f2fd588bc08fd965221959922afb93550a4a2c611f7400496be440e8fe2" Dec 05 06:37:56 crc kubenswrapper[4865]: I1205 06:37:56.979951 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7058f2fd588bc08fd965221959922afb93550a4a2c611f7400496be440e8fe2"} err="failed to get container status \"a7058f2fd588bc08fd965221959922afb93550a4a2c611f7400496be440e8fe2\": rpc error: code = NotFound desc = could not find container \"a7058f2fd588bc08fd965221959922afb93550a4a2c611f7400496be440e8fe2\": container with ID starting with a7058f2fd588bc08fd965221959922afb93550a4a2c611f7400496be440e8fe2 not found: ID does not exist" Dec 05 06:37:57 crc kubenswrapper[4865]: I1205 06:37:57.019794 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4e13ca9-98fc-4be4-8d16-04686701c822" path="/var/lib/kubelet/pods/f4e13ca9-98fc-4be4-8d16-04686701c822/volumes" Dec 05 06:37:57 crc kubenswrapper[4865]: E1205 06:37:57.081862 4865 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4e13ca9_98fc_4be4_8d16_04686701c822.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4e13ca9_98fc_4be4_8d16_04686701c822.slice/crio-f081c33ea19cd44f3ac8ae142b2f74d0bba4985bd3c08ba5f90e6cc617ff36da\": RecentStats: unable to find data in memory cache]" Dec 05 06:37:58 crc kubenswrapper[4865]: I1205 06:37:58.488655 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h6wqk" Dec 05 06:37:58 crc kubenswrapper[4865]: I1205 06:37:58.489048 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h6wqk" Dec 05 06:37:59 crc kubenswrapper[4865]: I1205 06:37:59.535566 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h6wqk" podUID="441bd7c2-c526-4bcd-a29c-ce3e62a1918a" containerName="registry-server" probeResult="failure" output=< Dec 05 06:37:59 crc kubenswrapper[4865]: timeout: failed to connect service ":50051" within 1s Dec 05 06:37:59 crc kubenswrapper[4865]: > Dec 05 06:38:08 crc kubenswrapper[4865]: I1205 06:38:08.557063 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h6wqk" Dec 05 06:38:08 crc kubenswrapper[4865]: I1205 06:38:08.611120 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h6wqk" Dec 05 06:38:08 crc kubenswrapper[4865]: I1205 06:38:08.817274 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h6wqk"] Dec 05 06:38:09 crc kubenswrapper[4865]: I1205 06:38:09.973515 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h6wqk" podUID="441bd7c2-c526-4bcd-a29c-ce3e62a1918a" containerName="registry-server" containerID="cri-o://9de125909156b70eb95a20a656a1c3edf06bb8f2ad4243b2a61a93544b651e3a" gracePeriod=2 Dec 05 06:38:10 crc kubenswrapper[4865]: I1205 06:38:10.571127 4865 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h6wqk" Dec 05 06:38:10 crc kubenswrapper[4865]: I1205 06:38:10.702405 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b49mk\" (UniqueName: \"kubernetes.io/projected/441bd7c2-c526-4bcd-a29c-ce3e62a1918a-kube-api-access-b49mk\") pod \"441bd7c2-c526-4bcd-a29c-ce3e62a1918a\" (UID: \"441bd7c2-c526-4bcd-a29c-ce3e62a1918a\") " Dec 05 06:38:10 crc kubenswrapper[4865]: I1205 06:38:10.702930 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/441bd7c2-c526-4bcd-a29c-ce3e62a1918a-utilities\") pod \"441bd7c2-c526-4bcd-a29c-ce3e62a1918a\" (UID: \"441bd7c2-c526-4bcd-a29c-ce3e62a1918a\") " Dec 05 06:38:10 crc kubenswrapper[4865]: I1205 06:38:10.703521 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/441bd7c2-c526-4bcd-a29c-ce3e62a1918a-utilities" (OuterVolumeSpecName: "utilities") pod "441bd7c2-c526-4bcd-a29c-ce3e62a1918a" (UID: "441bd7c2-c526-4bcd-a29c-ce3e62a1918a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:38:10 crc kubenswrapper[4865]: I1205 06:38:10.703751 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/441bd7c2-c526-4bcd-a29c-ce3e62a1918a-catalog-content\") pod \"441bd7c2-c526-4bcd-a29c-ce3e62a1918a\" (UID: \"441bd7c2-c526-4bcd-a29c-ce3e62a1918a\") " Dec 05 06:38:10 crc kubenswrapper[4865]: I1205 06:38:10.706483 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/441bd7c2-c526-4bcd-a29c-ce3e62a1918a-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:38:10 crc kubenswrapper[4865]: I1205 06:38:10.711155 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/441bd7c2-c526-4bcd-a29c-ce3e62a1918a-kube-api-access-b49mk" (OuterVolumeSpecName: "kube-api-access-b49mk") pod "441bd7c2-c526-4bcd-a29c-ce3e62a1918a" (UID: "441bd7c2-c526-4bcd-a29c-ce3e62a1918a"). InnerVolumeSpecName "kube-api-access-b49mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:38:10 crc kubenswrapper[4865]: I1205 06:38:10.808865 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b49mk\" (UniqueName: \"kubernetes.io/projected/441bd7c2-c526-4bcd-a29c-ce3e62a1918a-kube-api-access-b49mk\") on node \"crc\" DevicePath \"\"" Dec 05 06:38:10 crc kubenswrapper[4865]: I1205 06:38:10.833120 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/441bd7c2-c526-4bcd-a29c-ce3e62a1918a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "441bd7c2-c526-4bcd-a29c-ce3e62a1918a" (UID: "441bd7c2-c526-4bcd-a29c-ce3e62a1918a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:38:10 crc kubenswrapper[4865]: I1205 06:38:10.911025 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/441bd7c2-c526-4bcd-a29c-ce3e62a1918a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:38:10 crc kubenswrapper[4865]: I1205 06:38:10.987958 4865 generic.go:334] "Generic (PLEG): container finished" podID="441bd7c2-c526-4bcd-a29c-ce3e62a1918a" containerID="9de125909156b70eb95a20a656a1c3edf06bb8f2ad4243b2a61a93544b651e3a" exitCode=0 Dec 05 06:38:10 crc kubenswrapper[4865]: I1205 06:38:10.988037 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6wqk" event={"ID":"441bd7c2-c526-4bcd-a29c-ce3e62a1918a","Type":"ContainerDied","Data":"9de125909156b70eb95a20a656a1c3edf06bb8f2ad4243b2a61a93544b651e3a"} Dec 05 06:38:10 crc kubenswrapper[4865]: I1205 06:38:10.988065 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h6wqk" Dec 05 06:38:10 crc kubenswrapper[4865]: I1205 06:38:10.988101 4865 scope.go:117] "RemoveContainer" containerID="9de125909156b70eb95a20a656a1c3edf06bb8f2ad4243b2a61a93544b651e3a" Dec 05 06:38:10 crc kubenswrapper[4865]: I1205 06:38:10.988084 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6wqk" event={"ID":"441bd7c2-c526-4bcd-a29c-ce3e62a1918a","Type":"ContainerDied","Data":"2ece326ed53d4e6860bdb192e651869194685bc407b414c339b774b9805833b6"} Dec 05 06:38:11 crc kubenswrapper[4865]: I1205 06:38:11.018563 4865 scope.go:117] "RemoveContainer" containerID="8344b187a20bf6ea84295ee624ffbe11622ea12f2d2f771a95f058862d8f3544" Dec 05 06:38:11 crc kubenswrapper[4865]: I1205 06:38:11.048382 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h6wqk"] Dec 05 06:38:11 crc kubenswrapper[4865]: I1205 06:38:11.049425 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:38:11 crc kubenswrapper[4865]: I1205 06:38:11.049553 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:38:11 crc kubenswrapper[4865]: I1205 06:38:11.052812 4865 scope.go:117] "RemoveContainer" containerID="9534b32a7266a442e3627490d33dc54daa6bd3360cf00409f56debc3bb84550b" Dec 05 06:38:11 crc kubenswrapper[4865]: I1205 06:38:11.061255 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h6wqk"] Dec 05 06:38:11 crc kubenswrapper[4865]: I1205 06:38:11.092703 4865 scope.go:117] "RemoveContainer" containerID="9de125909156b70eb95a20a656a1c3edf06bb8f2ad4243b2a61a93544b651e3a" Dec 05 06:38:11 crc kubenswrapper[4865]: E1205 06:38:11.094789 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9de125909156b70eb95a20a656a1c3edf06bb8f2ad4243b2a61a93544b651e3a\": container with ID starting with 
9de125909156b70eb95a20a656a1c3edf06bb8f2ad4243b2a61a93544b651e3a not found: ID does not exist" containerID="9de125909156b70eb95a20a656a1c3edf06bb8f2ad4243b2a61a93544b651e3a" Dec 05 06:38:11 crc kubenswrapper[4865]: I1205 06:38:11.094877 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9de125909156b70eb95a20a656a1c3edf06bb8f2ad4243b2a61a93544b651e3a"} err="failed to get container status \"9de125909156b70eb95a20a656a1c3edf06bb8f2ad4243b2a61a93544b651e3a\": rpc error: code = NotFound desc = could not find container \"9de125909156b70eb95a20a656a1c3edf06bb8f2ad4243b2a61a93544b651e3a\": container with ID starting with 9de125909156b70eb95a20a656a1c3edf06bb8f2ad4243b2a61a93544b651e3a not found: ID does not exist" Dec 05 06:38:11 crc kubenswrapper[4865]: I1205 06:38:11.094917 4865 scope.go:117] "RemoveContainer" containerID="8344b187a20bf6ea84295ee624ffbe11622ea12f2d2f771a95f058862d8f3544" Dec 05 06:38:11 crc kubenswrapper[4865]: E1205 06:38:11.095392 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8344b187a20bf6ea84295ee624ffbe11622ea12f2d2f771a95f058862d8f3544\": container with ID starting with 8344b187a20bf6ea84295ee624ffbe11622ea12f2d2f771a95f058862d8f3544 not found: ID does not exist" containerID="8344b187a20bf6ea84295ee624ffbe11622ea12f2d2f771a95f058862d8f3544" Dec 05 06:38:11 crc kubenswrapper[4865]: I1205 06:38:11.095441 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8344b187a20bf6ea84295ee624ffbe11622ea12f2d2f771a95f058862d8f3544"} err="failed to get container status \"8344b187a20bf6ea84295ee624ffbe11622ea12f2d2f771a95f058862d8f3544\": rpc error: code = NotFound desc = could not find container \"8344b187a20bf6ea84295ee624ffbe11622ea12f2d2f771a95f058862d8f3544\": container with ID starting with 8344b187a20bf6ea84295ee624ffbe11622ea12f2d2f771a95f058862d8f3544 not found: ID does not exist" Dec 05 06:38:11 crc kubenswrapper[4865]: I1205 06:38:11.095473 4865 scope.go:117] "RemoveContainer" containerID="9534b32a7266a442e3627490d33dc54daa6bd3360cf00409f56debc3bb84550b" Dec 05 06:38:11 crc kubenswrapper[4865]: E1205 06:38:11.096115 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9534b32a7266a442e3627490d33dc54daa6bd3360cf00409f56debc3bb84550b\": container with ID starting with 9534b32a7266a442e3627490d33dc54daa6bd3360cf00409f56debc3bb84550b not found: ID does not exist" containerID="9534b32a7266a442e3627490d33dc54daa6bd3360cf00409f56debc3bb84550b" Dec 05 06:38:11 crc kubenswrapper[4865]: I1205 06:38:11.096146 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9534b32a7266a442e3627490d33dc54daa6bd3360cf00409f56debc3bb84550b"} err="failed to get container status \"9534b32a7266a442e3627490d33dc54daa6bd3360cf00409f56debc3bb84550b\": rpc error: code = NotFound desc = could not find container \"9534b32a7266a442e3627490d33dc54daa6bd3360cf00409f56debc3bb84550b\": container with ID starting with 9534b32a7266a442e3627490d33dc54daa6bd3360cf00409f56debc3bb84550b not found: ID does not exist" Dec 05 06:38:13 crc kubenswrapper[4865]: I1205 06:38:13.020345 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="441bd7c2-c526-4bcd-a29c-ce3e62a1918a" path="/var/lib/kubelet/pods/441bd7c2-c526-4bcd-a29c-ce3e62a1918a/volumes" Dec 05 06:38:41 crc kubenswrapper[4865]: I1205 06:38:41.049224 
4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:38:41 crc kubenswrapper[4865]: I1205 06:38:41.050104 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:38:41 crc kubenswrapper[4865]: I1205 06:38:41.050176 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 06:38:41 crc kubenswrapper[4865]: I1205 06:38:41.051314 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c8d511abdaaac858feab2a7a74288283921974991421c6d6ac314782261ff80d"} pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 06:38:41 crc kubenswrapper[4865]: I1205 06:38:41.051419 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" containerID="cri-o://c8d511abdaaac858feab2a7a74288283921974991421c6d6ac314782261ff80d" gracePeriod=600 Dec 05 06:38:41 crc kubenswrapper[4865]: I1205 06:38:41.339087 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="c8d511abdaaac858feab2a7a74288283921974991421c6d6ac314782261ff80d" exitCode=0 Dec 05 06:38:41 crc kubenswrapper[4865]: I1205 06:38:41.339144 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"c8d511abdaaac858feab2a7a74288283921974991421c6d6ac314782261ff80d"} Dec 05 06:38:41 crc kubenswrapper[4865]: I1205 06:38:41.339380 4865 scope.go:117] "RemoveContainer" containerID="0cc3e938f62badf90b6dc9eff221da7dc9edc6780c09f16365d51bf8f563638a" Dec 05 06:38:42 crc kubenswrapper[4865]: I1205 06:38:42.364948 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8"} Dec 05 06:40:19 crc kubenswrapper[4865]: I1205 06:40:19.310702 4865 generic.go:334] "Generic (PLEG): container finished" podID="4b81cc6f-f002-4a0d-911f-2aedbec17e6c" containerID="9d0ab3bc12121698e39b63c74f8baf5fda933c101046f7d55f72e5bf82594403" exitCode=0 Dec 05 06:40:19 crc kubenswrapper[4865]: I1205 06:40:19.311324 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" event={"ID":"4b81cc6f-f002-4a0d-911f-2aedbec17e6c","Type":"ContainerDied","Data":"9d0ab3bc12121698e39b63c74f8baf5fda933c101046f7d55f72e5bf82594403"} Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.800806 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.848887 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-combined-ca-bundle\") pod \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.849013 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-extra-config-0\") pod \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.849050 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-migration-ssh-key-0\") pod \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.849120 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-cell1-compute-config-0\") pod \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.849376 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-ssh-key\") pod \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.849423 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s9px\" (UniqueName: \"kubernetes.io/projected/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-kube-api-access-7s9px\") pod \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.849467 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-inventory\") pod \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.849585 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-cell1-compute-config-1\") pod \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.849663 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-migration-ssh-key-1\") pod \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\" (UID: \"4b81cc6f-f002-4a0d-911f-2aedbec17e6c\") " Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.878510 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "4b81cc6f-f002-4a0d-911f-2aedbec17e6c" (UID: "4b81cc6f-f002-4a0d-911f-2aedbec17e6c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.881967 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4b81cc6f-f002-4a0d-911f-2aedbec17e6c" (UID: "4b81cc6f-f002-4a0d-911f-2aedbec17e6c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.885509 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-kube-api-access-7s9px" (OuterVolumeSpecName: "kube-api-access-7s9px") pod "4b81cc6f-f002-4a0d-911f-2aedbec17e6c" (UID: "4b81cc6f-f002-4a0d-911f-2aedbec17e6c"). InnerVolumeSpecName "kube-api-access-7s9px". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.899599 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "4b81cc6f-f002-4a0d-911f-2aedbec17e6c" (UID: "4b81cc6f-f002-4a0d-911f-2aedbec17e6c"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.899616 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "4b81cc6f-f002-4a0d-911f-2aedbec17e6c" (UID: "4b81cc6f-f002-4a0d-911f-2aedbec17e6c"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.899796 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "4b81cc6f-f002-4a0d-911f-2aedbec17e6c" (UID: "4b81cc6f-f002-4a0d-911f-2aedbec17e6c"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.902006 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "4b81cc6f-f002-4a0d-911f-2aedbec17e6c" (UID: "4b81cc6f-f002-4a0d-911f-2aedbec17e6c"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.905776 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-inventory" (OuterVolumeSpecName: "inventory") pod "4b81cc6f-f002-4a0d-911f-2aedbec17e6c" (UID: "4b81cc6f-f002-4a0d-911f-2aedbec17e6c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.910969 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "4b81cc6f-f002-4a0d-911f-2aedbec17e6c" (UID: "4b81cc6f-f002-4a0d-911f-2aedbec17e6c"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.951175 4865 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.951208 4865 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.951228 4865 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.951238 4865 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.951247 4865 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.951257 4865 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.951266 4865 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.951275 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s9px\" (UniqueName: \"kubernetes.io/projected/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-kube-api-access-7s9px\") on node \"crc\" DevicePath \"\"" Dec 05 06:40:20 crc kubenswrapper[4865]: I1205 06:40:20.951285 4865 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b81cc6f-f002-4a0d-911f-2aedbec17e6c-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.332559 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" event={"ID":"4b81cc6f-f002-4a0d-911f-2aedbec17e6c","Type":"ContainerDied","Data":"2a0070cd03a244e7ddaa12c91a67c8ab4e62b6dd803d1b78115451f1a8c22ccc"} Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.332975 4865 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="2a0070cd03a244e7ddaa12c91a67c8ab4e62b6dd803d1b78115451f1a8c22ccc" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.332602 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8pmn8" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.499538 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7"] Dec 05 06:40:21 crc kubenswrapper[4865]: E1205 06:40:21.500090 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="441bd7c2-c526-4bcd-a29c-ce3e62a1918a" containerName="extract-content" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.500110 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="441bd7c2-c526-4bcd-a29c-ce3e62a1918a" containerName="extract-content" Dec 05 06:40:21 crc kubenswrapper[4865]: E1205 06:40:21.500133 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="441bd7c2-c526-4bcd-a29c-ce3e62a1918a" containerName="registry-server" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.500167 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="441bd7c2-c526-4bcd-a29c-ce3e62a1918a" containerName="registry-server" Dec 05 06:40:21 crc kubenswrapper[4865]: E1205 06:40:21.500197 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="441bd7c2-c526-4bcd-a29c-ce3e62a1918a" containerName="extract-utilities" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.500207 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="441bd7c2-c526-4bcd-a29c-ce3e62a1918a" containerName="extract-utilities" Dec 05 06:40:21 crc kubenswrapper[4865]: E1205 06:40:21.500223 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e13ca9-98fc-4be4-8d16-04686701c822" containerName="extract-content" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.500231 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e13ca9-98fc-4be4-8d16-04686701c822" containerName="extract-content" Dec 05 06:40:21 crc kubenswrapper[4865]: E1205 06:40:21.500249 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e13ca9-98fc-4be4-8d16-04686701c822" containerName="registry-server" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.500257 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e13ca9-98fc-4be4-8d16-04686701c822" containerName="registry-server" Dec 05 06:40:21 crc kubenswrapper[4865]: E1205 06:40:21.500278 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b81cc6f-f002-4a0d-911f-2aedbec17e6c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.500287 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b81cc6f-f002-4a0d-911f-2aedbec17e6c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 05 06:40:21 crc kubenswrapper[4865]: E1205 06:40:21.500305 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e13ca9-98fc-4be4-8d16-04686701c822" containerName="extract-utilities" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.500313 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e13ca9-98fc-4be4-8d16-04686701c822" containerName="extract-utilities" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.500535 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4e13ca9-98fc-4be4-8d16-04686701c822" containerName="registry-server" Dec 05 06:40:21 crc kubenswrapper[4865]: 
I1205 06:40:21.500554 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b81cc6f-f002-4a0d-911f-2aedbec17e6c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.500585 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="441bd7c2-c526-4bcd-a29c-ce3e62a1918a" containerName="registry-server" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.501304 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.503613 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.504312 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.504697 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.505686 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.508668 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gtc4b" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.515655 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7"] Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.564952 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.565027 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.565087 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.565209 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pvpd\" (UniqueName: \"kubernetes.io/projected/4fe98c92-1aa9-444a-88d9-1280d7865f92-kube-api-access-7pvpd\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.565310 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.565369 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.565407 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.667348 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.667446 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pvpd\" (UniqueName: \"kubernetes.io/projected/4fe98c92-1aa9-444a-88d9-1280d7865f92-kube-api-access-7pvpd\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.667495 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.667527 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.667559 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.667646 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.667678 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.672313 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.672313 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.672869 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.673920 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.674433 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.678316 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.684782 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pvpd\" (UniqueName: \"kubernetes.io/projected/4fe98c92-1aa9-444a-88d9-1280d7865f92-kube-api-access-7pvpd\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:21 crc kubenswrapper[4865]: I1205 06:40:21.855214 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:40:22 crc kubenswrapper[4865]: I1205 06:40:22.398732 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7"] Dec 05 06:40:23 crc kubenswrapper[4865]: I1205 06:40:23.355149 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" event={"ID":"4fe98c92-1aa9-444a-88d9-1280d7865f92","Type":"ContainerStarted","Data":"050711682ab536bed737142cca106eb6b6c6b8e25f3d80af5a3175823f7595f3"} Dec 05 06:40:23 crc kubenswrapper[4865]: I1205 06:40:23.355845 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" event={"ID":"4fe98c92-1aa9-444a-88d9-1280d7865f92","Type":"ContainerStarted","Data":"409121c1def8145e26b154051c7789c93711f8ae1daf058e9591afff5ebb7c14"} Dec 05 06:40:23 crc kubenswrapper[4865]: I1205 06:40:23.385657 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" podStartSLOduration=1.897333987 podStartE2EDuration="2.385628817s" podCreationTimestamp="2025-12-05 06:40:21 +0000 UTC" firstStartedPulling="2025-12-05 06:40:22.399278449 +0000 UTC m=+2841.679289671" lastFinishedPulling="2025-12-05 06:40:22.887573269 +0000 UTC m=+2842.167584501" observedRunningTime="2025-12-05 06:40:23.376235 +0000 UTC m=+2842.656246212" watchObservedRunningTime="2025-12-05 06:40:23.385628817 +0000 UTC m=+2842.665640079" Dec 05 06:40:41 crc kubenswrapper[4865]: I1205 06:40:41.049016 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:40:41 crc kubenswrapper[4865]: I1205 06:40:41.049626 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:41:11 crc kubenswrapper[4865]: I1205 06:41:11.049648 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:41:11 crc 
kubenswrapper[4865]: I1205 06:41:11.050411 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:41:41 crc kubenswrapper[4865]: I1205 06:41:41.049398 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:41:41 crc kubenswrapper[4865]: I1205 06:41:41.050111 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:41:41 crc kubenswrapper[4865]: I1205 06:41:41.050196 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 06:41:41 crc kubenswrapper[4865]: I1205 06:41:41.051592 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8"} pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 06:41:41 crc kubenswrapper[4865]: I1205 06:41:41.051709 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" containerID="cri-o://22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" gracePeriod=600 Dec 05 06:41:41 crc kubenswrapper[4865]: E1205 06:41:41.178400 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:41:42 crc kubenswrapper[4865]: I1205 06:41:42.156559 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" exitCode=0 Dec 05 06:41:42 crc kubenswrapper[4865]: I1205 06:41:42.156601 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8"} Dec 05 06:41:42 crc kubenswrapper[4865]: I1205 06:41:42.156632 4865 scope.go:117] "RemoveContainer" containerID="c8d511abdaaac858feab2a7a74288283921974991421c6d6ac314782261ff80d" Dec 05 06:41:42 crc kubenswrapper[4865]: I1205 06:41:42.157246 4865 scope.go:117] "RemoveContainer" 
containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:41:42 crc kubenswrapper[4865]: E1205 06:41:42.157491 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:41:55 crc kubenswrapper[4865]: I1205 06:41:55.007185 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:41:55 crc kubenswrapper[4865]: E1205 06:41:55.008107 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:42:10 crc kubenswrapper[4865]: I1205 06:42:10.006872 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:42:10 crc kubenswrapper[4865]: E1205 06:42:10.007654 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:42:25 crc kubenswrapper[4865]: I1205 06:42:25.006952 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:42:25 crc kubenswrapper[4865]: E1205 06:42:25.007758 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:42:38 crc kubenswrapper[4865]: I1205 06:42:38.006446 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:42:38 crc kubenswrapper[4865]: E1205 06:42:38.007229 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:42:49 crc kubenswrapper[4865]: I1205 06:42:49.007257 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:42:49 crc kubenswrapper[4865]: E1205 06:42:49.010233 4865 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:43:00 crc kubenswrapper[4865]: I1205 06:43:00.006301 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:43:00 crc kubenswrapper[4865]: E1205 06:43:00.006966 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:43:14 crc kubenswrapper[4865]: I1205 06:43:14.006691 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:43:14 crc kubenswrapper[4865]: E1205 06:43:14.007524 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:43:25 crc kubenswrapper[4865]: I1205 06:43:25.007001 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:43:25 crc kubenswrapper[4865]: E1205 06:43:25.007726 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:43:38 crc kubenswrapper[4865]: I1205 06:43:38.467067 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rmndk"] Dec 05 06:43:38 crc kubenswrapper[4865]: I1205 06:43:38.474910 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmndk" Dec 05 06:43:38 crc kubenswrapper[4865]: I1205 06:43:38.503525 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01fde809-1d90-4a6b-a6b4-1cc442235923-catalog-content\") pod \"redhat-marketplace-rmndk\" (UID: \"01fde809-1d90-4a6b-a6b4-1cc442235923\") " pod="openshift-marketplace/redhat-marketplace-rmndk" Dec 05 06:43:38 crc kubenswrapper[4865]: I1205 06:43:38.503799 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpv47\" (UniqueName: \"kubernetes.io/projected/01fde809-1d90-4a6b-a6b4-1cc442235923-kube-api-access-vpv47\") pod \"redhat-marketplace-rmndk\" (UID: \"01fde809-1d90-4a6b-a6b4-1cc442235923\") " pod="openshift-marketplace/redhat-marketplace-rmndk" Dec 05 06:43:38 crc kubenswrapper[4865]: I1205 06:43:38.503946 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01fde809-1d90-4a6b-a6b4-1cc442235923-utilities\") pod \"redhat-marketplace-rmndk\" (UID: \"01fde809-1d90-4a6b-a6b4-1cc442235923\") " pod="openshift-marketplace/redhat-marketplace-rmndk" Dec 05 06:43:38 crc kubenswrapper[4865]: I1205 06:43:38.541548 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmndk"] Dec 05 06:43:38 crc kubenswrapper[4865]: I1205 06:43:38.606033 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01fde809-1d90-4a6b-a6b4-1cc442235923-catalog-content\") pod \"redhat-marketplace-rmndk\" (UID: \"01fde809-1d90-4a6b-a6b4-1cc442235923\") " pod="openshift-marketplace/redhat-marketplace-rmndk" Dec 05 06:43:38 crc kubenswrapper[4865]: I1205 06:43:38.606093 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpv47\" (UniqueName: \"kubernetes.io/projected/01fde809-1d90-4a6b-a6b4-1cc442235923-kube-api-access-vpv47\") pod \"redhat-marketplace-rmndk\" (UID: \"01fde809-1d90-4a6b-a6b4-1cc442235923\") " pod="openshift-marketplace/redhat-marketplace-rmndk" Dec 05 06:43:38 crc kubenswrapper[4865]: I1205 06:43:38.606155 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01fde809-1d90-4a6b-a6b4-1cc442235923-utilities\") pod \"redhat-marketplace-rmndk\" (UID: \"01fde809-1d90-4a6b-a6b4-1cc442235923\") " pod="openshift-marketplace/redhat-marketplace-rmndk" Dec 05 06:43:38 crc kubenswrapper[4865]: I1205 06:43:38.606578 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01fde809-1d90-4a6b-a6b4-1cc442235923-catalog-content\") pod \"redhat-marketplace-rmndk\" (UID: \"01fde809-1d90-4a6b-a6b4-1cc442235923\") " pod="openshift-marketplace/redhat-marketplace-rmndk" Dec 05 06:43:38 crc kubenswrapper[4865]: I1205 06:43:38.606624 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01fde809-1d90-4a6b-a6b4-1cc442235923-utilities\") pod \"redhat-marketplace-rmndk\" (UID: \"01fde809-1d90-4a6b-a6b4-1cc442235923\") " pod="openshift-marketplace/redhat-marketplace-rmndk" Dec 05 06:43:38 crc kubenswrapper[4865]: I1205 06:43:38.631400 4865 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vpv47\" (UniqueName: \"kubernetes.io/projected/01fde809-1d90-4a6b-a6b4-1cc442235923-kube-api-access-vpv47\") pod \"redhat-marketplace-rmndk\" (UID: \"01fde809-1d90-4a6b-a6b4-1cc442235923\") " pod="openshift-marketplace/redhat-marketplace-rmndk" Dec 05 06:43:38 crc kubenswrapper[4865]: I1205 06:43:38.796468 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmndk" Dec 05 06:43:39 crc kubenswrapper[4865]: I1205 06:43:39.406246 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmndk"] Dec 05 06:43:40 crc kubenswrapper[4865]: I1205 06:43:40.006878 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:43:40 crc kubenswrapper[4865]: E1205 06:43:40.007454 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:43:40 crc kubenswrapper[4865]: I1205 06:43:40.338957 4865 generic.go:334] "Generic (PLEG): container finished" podID="01fde809-1d90-4a6b-a6b4-1cc442235923" containerID="fcbcc744ef2051e3a2c852c79a598df9dbee8a26fe01a10f88fa0480725cd5ce" exitCode=0 Dec 05 06:43:40 crc kubenswrapper[4865]: I1205 06:43:40.339040 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmndk" event={"ID":"01fde809-1d90-4a6b-a6b4-1cc442235923","Type":"ContainerDied","Data":"fcbcc744ef2051e3a2c852c79a598df9dbee8a26fe01a10f88fa0480725cd5ce"} Dec 05 06:43:40 crc kubenswrapper[4865]: I1205 06:43:40.339117 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmndk" event={"ID":"01fde809-1d90-4a6b-a6b4-1cc442235923","Type":"ContainerStarted","Data":"9a778ce101a33916955457e74128ae3084632c678c83182127b0c639d8950771"} Dec 05 06:43:40 crc kubenswrapper[4865]: I1205 06:43:40.341195 4865 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 06:43:41 crc kubenswrapper[4865]: I1205 06:43:41.665178 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gn9qq"] Dec 05 06:43:41 crc kubenswrapper[4865]: I1205 06:43:41.667295 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gn9qq" Dec 05 06:43:41 crc kubenswrapper[4865]: I1205 06:43:41.678721 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gn9qq"] Dec 05 06:43:41 crc kubenswrapper[4865]: I1205 06:43:41.769643 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sn2k\" (UniqueName: \"kubernetes.io/projected/0f217a20-d75d-488d-a70d-a24d7607832b-kube-api-access-5sn2k\") pod \"community-operators-gn9qq\" (UID: \"0f217a20-d75d-488d-a70d-a24d7607832b\") " pod="openshift-marketplace/community-operators-gn9qq" Dec 05 06:43:41 crc kubenswrapper[4865]: I1205 06:43:41.769707 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f217a20-d75d-488d-a70d-a24d7607832b-utilities\") pod \"community-operators-gn9qq\" (UID: \"0f217a20-d75d-488d-a70d-a24d7607832b\") " pod="openshift-marketplace/community-operators-gn9qq" Dec 05 06:43:41 crc kubenswrapper[4865]: I1205 06:43:41.769732 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f217a20-d75d-488d-a70d-a24d7607832b-catalog-content\") pod \"community-operators-gn9qq\" (UID: \"0f217a20-d75d-488d-a70d-a24d7607832b\") " pod="openshift-marketplace/community-operators-gn9qq" Dec 05 06:43:41 crc kubenswrapper[4865]: I1205 06:43:41.871622 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sn2k\" (UniqueName: \"kubernetes.io/projected/0f217a20-d75d-488d-a70d-a24d7607832b-kube-api-access-5sn2k\") pod \"community-operators-gn9qq\" (UID: \"0f217a20-d75d-488d-a70d-a24d7607832b\") " pod="openshift-marketplace/community-operators-gn9qq" Dec 05 06:43:41 crc kubenswrapper[4865]: I1205 06:43:41.871960 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f217a20-d75d-488d-a70d-a24d7607832b-utilities\") pod \"community-operators-gn9qq\" (UID: \"0f217a20-d75d-488d-a70d-a24d7607832b\") " pod="openshift-marketplace/community-operators-gn9qq" Dec 05 06:43:41 crc kubenswrapper[4865]: I1205 06:43:41.871983 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f217a20-d75d-488d-a70d-a24d7607832b-catalog-content\") pod \"community-operators-gn9qq\" (UID: \"0f217a20-d75d-488d-a70d-a24d7607832b\") " pod="openshift-marketplace/community-operators-gn9qq" Dec 05 06:43:41 crc kubenswrapper[4865]: I1205 06:43:41.872410 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f217a20-d75d-488d-a70d-a24d7607832b-utilities\") pod \"community-operators-gn9qq\" (UID: \"0f217a20-d75d-488d-a70d-a24d7607832b\") " pod="openshift-marketplace/community-operators-gn9qq" Dec 05 06:43:41 crc kubenswrapper[4865]: I1205 06:43:41.872473 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f217a20-d75d-488d-a70d-a24d7607832b-catalog-content\") pod \"community-operators-gn9qq\" (UID: \"0f217a20-d75d-488d-a70d-a24d7607832b\") " pod="openshift-marketplace/community-operators-gn9qq" Dec 05 06:43:41 crc kubenswrapper[4865]: I1205 06:43:41.894628 4865 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5sn2k\" (UniqueName: \"kubernetes.io/projected/0f217a20-d75d-488d-a70d-a24d7607832b-kube-api-access-5sn2k\") pod \"community-operators-gn9qq\" (UID: \"0f217a20-d75d-488d-a70d-a24d7607832b\") " pod="openshift-marketplace/community-operators-gn9qq" Dec 05 06:43:41 crc kubenswrapper[4865]: I1205 06:43:41.989597 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gn9qq" Dec 05 06:43:42 crc kubenswrapper[4865]: I1205 06:43:42.365601 4865 generic.go:334] "Generic (PLEG): container finished" podID="01fde809-1d90-4a6b-a6b4-1cc442235923" containerID="59d153190f135a046b7b4b3ba4f3575d739afd8018a6f131b68cd95e2bae79db" exitCode=0 Dec 05 06:43:42 crc kubenswrapper[4865]: I1205 06:43:42.365772 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmndk" event={"ID":"01fde809-1d90-4a6b-a6b4-1cc442235923","Type":"ContainerDied","Data":"59d153190f135a046b7b4b3ba4f3575d739afd8018a6f131b68cd95e2bae79db"} Dec 05 06:43:42 crc kubenswrapper[4865]: I1205 06:43:42.525386 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gn9qq"] Dec 05 06:43:43 crc kubenswrapper[4865]: I1205 06:43:43.377909 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmndk" event={"ID":"01fde809-1d90-4a6b-a6b4-1cc442235923","Type":"ContainerStarted","Data":"df2eb54c435fe77c0faaf67e2978b8f72f2e4e44de4bc7b70893ae1768093da6"} Dec 05 06:43:43 crc kubenswrapper[4865]: I1205 06:43:43.382797 4865 generic.go:334] "Generic (PLEG): container finished" podID="0f217a20-d75d-488d-a70d-a24d7607832b" containerID="aa4bbbd3e7c3d0f60c892656f6fee29a06790fc6fc45fc338bbb40508128e4b4" exitCode=0 Dec 05 06:43:43 crc kubenswrapper[4865]: I1205 06:43:43.382879 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gn9qq" event={"ID":"0f217a20-d75d-488d-a70d-a24d7607832b","Type":"ContainerDied","Data":"aa4bbbd3e7c3d0f60c892656f6fee29a06790fc6fc45fc338bbb40508128e4b4"} Dec 05 06:43:43 crc kubenswrapper[4865]: I1205 06:43:43.382910 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gn9qq" event={"ID":"0f217a20-d75d-488d-a70d-a24d7607832b","Type":"ContainerStarted","Data":"ad38129132703c737385a6a473f182f88ce3d6d51fea6355254d747a06e8807a"} Dec 05 06:43:43 crc kubenswrapper[4865]: I1205 06:43:43.413891 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rmndk" podStartSLOduration=3.004296503 podStartE2EDuration="5.413869473s" podCreationTimestamp="2025-12-05 06:43:38 +0000 UTC" firstStartedPulling="2025-12-05 06:43:40.340935532 +0000 UTC m=+3039.620946754" lastFinishedPulling="2025-12-05 06:43:42.750508502 +0000 UTC m=+3042.030519724" observedRunningTime="2025-12-05 06:43:43.404076965 +0000 UTC m=+3042.684088187" watchObservedRunningTime="2025-12-05 06:43:43.413869473 +0000 UTC m=+3042.693880695" Dec 05 06:43:44 crc kubenswrapper[4865]: I1205 06:43:44.392360 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gn9qq" event={"ID":"0f217a20-d75d-488d-a70d-a24d7607832b","Type":"ContainerStarted","Data":"1491ed6f9a50801e1c263153c2708ab7f67ec15e27820c9252948e1dab7f80d6"} Dec 05 06:43:45 crc kubenswrapper[4865]: I1205 06:43:45.405328 4865 generic.go:334] "Generic (PLEG): container finished" 
podID="0f217a20-d75d-488d-a70d-a24d7607832b" containerID="1491ed6f9a50801e1c263153c2708ab7f67ec15e27820c9252948e1dab7f80d6" exitCode=0 Dec 05 06:43:45 crc kubenswrapper[4865]: I1205 06:43:45.405381 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gn9qq" event={"ID":"0f217a20-d75d-488d-a70d-a24d7607832b","Type":"ContainerDied","Data":"1491ed6f9a50801e1c263153c2708ab7f67ec15e27820c9252948e1dab7f80d6"} Dec 05 06:43:47 crc kubenswrapper[4865]: I1205 06:43:47.438321 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gn9qq" event={"ID":"0f217a20-d75d-488d-a70d-a24d7607832b","Type":"ContainerStarted","Data":"656efa75b042675bedf0f4e3c05c4c488108f15ed97437c1a9ce0b24cc6186c7"} Dec 05 06:43:47 crc kubenswrapper[4865]: I1205 06:43:47.472069 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gn9qq" podStartSLOduration=3.300752794 podStartE2EDuration="6.472052198s" podCreationTimestamp="2025-12-05 06:43:41 +0000 UTC" firstStartedPulling="2025-12-05 06:43:43.384429587 +0000 UTC m=+3042.664440809" lastFinishedPulling="2025-12-05 06:43:46.555728991 +0000 UTC m=+3045.835740213" observedRunningTime="2025-12-05 06:43:47.468740354 +0000 UTC m=+3046.748751586" watchObservedRunningTime="2025-12-05 06:43:47.472052198 +0000 UTC m=+3046.752063420" Dec 05 06:43:48 crc kubenswrapper[4865]: I1205 06:43:48.797938 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rmndk" Dec 05 06:43:48 crc kubenswrapper[4865]: I1205 06:43:48.797997 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rmndk" Dec 05 06:43:48 crc kubenswrapper[4865]: I1205 06:43:48.856420 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rmndk" Dec 05 06:43:49 crc kubenswrapper[4865]: I1205 06:43:49.508838 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rmndk" Dec 05 06:43:51 crc kubenswrapper[4865]: I1205 06:43:51.989917 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gn9qq" Dec 05 06:43:51 crc kubenswrapper[4865]: I1205 06:43:51.990380 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gn9qq" Dec 05 06:43:52 crc kubenswrapper[4865]: I1205 06:43:52.006952 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:43:52 crc kubenswrapper[4865]: E1205 06:43:52.007216 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:43:52 crc kubenswrapper[4865]: I1205 06:43:52.040869 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gn9qq" Dec 05 06:43:52 crc kubenswrapper[4865]: I1205 06:43:52.051688 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-rmndk"] Dec 05 06:43:52 crc kubenswrapper[4865]: I1205 06:43:52.052126 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rmndk" podUID="01fde809-1d90-4a6b-a6b4-1cc442235923" containerName="registry-server" containerID="cri-o://df2eb54c435fe77c0faaf67e2978b8f72f2e4e44de4bc7b70893ae1768093da6" gracePeriod=2 Dec 05 06:43:52 crc kubenswrapper[4865]: I1205 06:43:52.486303 4865 generic.go:334] "Generic (PLEG): container finished" podID="01fde809-1d90-4a6b-a6b4-1cc442235923" containerID="df2eb54c435fe77c0faaf67e2978b8f72f2e4e44de4bc7b70893ae1768093da6" exitCode=0 Dec 05 06:43:52 crc kubenswrapper[4865]: I1205 06:43:52.488647 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmndk" event={"ID":"01fde809-1d90-4a6b-a6b4-1cc442235923","Type":"ContainerDied","Data":"df2eb54c435fe77c0faaf67e2978b8f72f2e4e44de4bc7b70893ae1768093da6"} Dec 05 06:43:52 crc kubenswrapper[4865]: I1205 06:43:52.488698 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmndk" event={"ID":"01fde809-1d90-4a6b-a6b4-1cc442235923","Type":"ContainerDied","Data":"9a778ce101a33916955457e74128ae3084632c678c83182127b0c639d8950771"} Dec 05 06:43:52 crc kubenswrapper[4865]: I1205 06:43:52.488736 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a778ce101a33916955457e74128ae3084632c678c83182127b0c639d8950771" Dec 05 06:43:52 crc kubenswrapper[4865]: I1205 06:43:52.524746 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmndk" Dec 05 06:43:52 crc kubenswrapper[4865]: I1205 06:43:52.593331 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01fde809-1d90-4a6b-a6b4-1cc442235923-catalog-content\") pod \"01fde809-1d90-4a6b-a6b4-1cc442235923\" (UID: \"01fde809-1d90-4a6b-a6b4-1cc442235923\") " Dec 05 06:43:52 crc kubenswrapper[4865]: I1205 06:43:52.593465 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpv47\" (UniqueName: \"kubernetes.io/projected/01fde809-1d90-4a6b-a6b4-1cc442235923-kube-api-access-vpv47\") pod \"01fde809-1d90-4a6b-a6b4-1cc442235923\" (UID: \"01fde809-1d90-4a6b-a6b4-1cc442235923\") " Dec 05 06:43:52 crc kubenswrapper[4865]: I1205 06:43:52.593678 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01fde809-1d90-4a6b-a6b4-1cc442235923-utilities\") pod \"01fde809-1d90-4a6b-a6b4-1cc442235923\" (UID: \"01fde809-1d90-4a6b-a6b4-1cc442235923\") " Dec 05 06:43:52 crc kubenswrapper[4865]: I1205 06:43:52.595066 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01fde809-1d90-4a6b-a6b4-1cc442235923-utilities" (OuterVolumeSpecName: "utilities") pod "01fde809-1d90-4a6b-a6b4-1cc442235923" (UID: "01fde809-1d90-4a6b-a6b4-1cc442235923"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:43:52 crc kubenswrapper[4865]: I1205 06:43:52.598377 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gn9qq" Dec 05 06:43:52 crc kubenswrapper[4865]: I1205 06:43:52.600885 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01fde809-1d90-4a6b-a6b4-1cc442235923-kube-api-access-vpv47" (OuterVolumeSpecName: "kube-api-access-vpv47") pod "01fde809-1d90-4a6b-a6b4-1cc442235923" (UID: "01fde809-1d90-4a6b-a6b4-1cc442235923"). InnerVolumeSpecName "kube-api-access-vpv47". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:43:52 crc kubenswrapper[4865]: I1205 06:43:52.639101 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01fde809-1d90-4a6b-a6b4-1cc442235923-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01fde809-1d90-4a6b-a6b4-1cc442235923" (UID: "01fde809-1d90-4a6b-a6b4-1cc442235923"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:43:52 crc kubenswrapper[4865]: I1205 06:43:52.695816 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpv47\" (UniqueName: \"kubernetes.io/projected/01fde809-1d90-4a6b-a6b4-1cc442235923-kube-api-access-vpv47\") on node \"crc\" DevicePath \"\"" Dec 05 06:43:52 crc kubenswrapper[4865]: I1205 06:43:52.695866 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01fde809-1d90-4a6b-a6b4-1cc442235923-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:43:52 crc kubenswrapper[4865]: I1205 06:43:52.695878 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01fde809-1d90-4a6b-a6b4-1cc442235923-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:43:53 crc kubenswrapper[4865]: I1205 06:43:53.495690 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmndk" Dec 05 06:43:53 crc kubenswrapper[4865]: I1205 06:43:53.527184 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmndk"] Dec 05 06:43:53 crc kubenswrapper[4865]: I1205 06:43:53.542284 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmndk"] Dec 05 06:43:54 crc kubenswrapper[4865]: I1205 06:43:54.456338 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gn9qq"] Dec 05 06:43:54 crc kubenswrapper[4865]: I1205 06:43:54.504978 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gn9qq" podUID="0f217a20-d75d-488d-a70d-a24d7607832b" containerName="registry-server" containerID="cri-o://656efa75b042675bedf0f4e3c05c4c488108f15ed97437c1a9ce0b24cc6186c7" gracePeriod=2 Dec 05 06:43:54 crc kubenswrapper[4865]: I1205 06:43:54.920186 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gn9qq" Dec 05 06:43:54 crc kubenswrapper[4865]: I1205 06:43:54.935095 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f217a20-d75d-488d-a70d-a24d7607832b-catalog-content\") pod \"0f217a20-d75d-488d-a70d-a24d7607832b\" (UID: \"0f217a20-d75d-488d-a70d-a24d7607832b\") " Dec 05 06:43:54 crc kubenswrapper[4865]: I1205 06:43:54.935386 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f217a20-d75d-488d-a70d-a24d7607832b-utilities\") pod \"0f217a20-d75d-488d-a70d-a24d7607832b\" (UID: \"0f217a20-d75d-488d-a70d-a24d7607832b\") " Dec 05 06:43:54 crc kubenswrapper[4865]: I1205 06:43:54.935672 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sn2k\" (UniqueName: \"kubernetes.io/projected/0f217a20-d75d-488d-a70d-a24d7607832b-kube-api-access-5sn2k\") pod \"0f217a20-d75d-488d-a70d-a24d7607832b\" (UID: \"0f217a20-d75d-488d-a70d-a24d7607832b\") " Dec 05 06:43:54 crc kubenswrapper[4865]: I1205 06:43:54.936018 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f217a20-d75d-488d-a70d-a24d7607832b-utilities" (OuterVolumeSpecName: "utilities") pod "0f217a20-d75d-488d-a70d-a24d7607832b" (UID: "0f217a20-d75d-488d-a70d-a24d7607832b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:43:54 crc kubenswrapper[4865]: I1205 06:43:54.936353 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f217a20-d75d-488d-a70d-a24d7607832b-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:43:54 crc kubenswrapper[4865]: I1205 06:43:54.941060 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f217a20-d75d-488d-a70d-a24d7607832b-kube-api-access-5sn2k" (OuterVolumeSpecName: "kube-api-access-5sn2k") pod "0f217a20-d75d-488d-a70d-a24d7607832b" (UID: "0f217a20-d75d-488d-a70d-a24d7607832b"). InnerVolumeSpecName "kube-api-access-5sn2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:43:55 crc kubenswrapper[4865]: I1205 06:43:55.001362 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f217a20-d75d-488d-a70d-a24d7607832b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f217a20-d75d-488d-a70d-a24d7607832b" (UID: "0f217a20-d75d-488d-a70d-a24d7607832b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:43:55 crc kubenswrapper[4865]: I1205 06:43:55.017058 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01fde809-1d90-4a6b-a6b4-1cc442235923" path="/var/lib/kubelet/pods/01fde809-1d90-4a6b-a6b4-1cc442235923/volumes" Dec 05 06:43:55 crc kubenswrapper[4865]: I1205 06:43:55.037621 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f217a20-d75d-488d-a70d-a24d7607832b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:43:55 crc kubenswrapper[4865]: I1205 06:43:55.037649 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sn2k\" (UniqueName: \"kubernetes.io/projected/0f217a20-d75d-488d-a70d-a24d7607832b-kube-api-access-5sn2k\") on node \"crc\" DevicePath \"\"" Dec 05 06:43:55 crc kubenswrapper[4865]: I1205 06:43:55.514557 4865 generic.go:334] "Generic (PLEG): container finished" podID="0f217a20-d75d-488d-a70d-a24d7607832b" containerID="656efa75b042675bedf0f4e3c05c4c488108f15ed97437c1a9ce0b24cc6186c7" exitCode=0 Dec 05 06:43:55 crc kubenswrapper[4865]: I1205 06:43:55.514712 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gn9qq" event={"ID":"0f217a20-d75d-488d-a70d-a24d7607832b","Type":"ContainerDied","Data":"656efa75b042675bedf0f4e3c05c4c488108f15ed97437c1a9ce0b24cc6186c7"} Dec 05 06:43:55 crc kubenswrapper[4865]: I1205 06:43:55.514897 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gn9qq" event={"ID":"0f217a20-d75d-488d-a70d-a24d7607832b","Type":"ContainerDied","Data":"ad38129132703c737385a6a473f182f88ce3d6d51fea6355254d747a06e8807a"} Dec 05 06:43:55 crc kubenswrapper[4865]: I1205 06:43:55.514927 4865 scope.go:117] "RemoveContainer" containerID="656efa75b042675bedf0f4e3c05c4c488108f15ed97437c1a9ce0b24cc6186c7" Dec 05 06:43:55 crc kubenswrapper[4865]: I1205 06:43:55.514782 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gn9qq" Dec 05 06:43:55 crc kubenswrapper[4865]: I1205 06:43:55.548717 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gn9qq"] Dec 05 06:43:55 crc kubenswrapper[4865]: I1205 06:43:55.549050 4865 scope.go:117] "RemoveContainer" containerID="1491ed6f9a50801e1c263153c2708ab7f67ec15e27820c9252948e1dab7f80d6" Dec 05 06:43:55 crc kubenswrapper[4865]: I1205 06:43:55.557183 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gn9qq"] Dec 05 06:43:55 crc kubenswrapper[4865]: I1205 06:43:55.571642 4865 scope.go:117] "RemoveContainer" containerID="aa4bbbd3e7c3d0f60c892656f6fee29a06790fc6fc45fc338bbb40508128e4b4" Dec 05 06:43:55 crc kubenswrapper[4865]: I1205 06:43:55.616957 4865 scope.go:117] "RemoveContainer" containerID="656efa75b042675bedf0f4e3c05c4c488108f15ed97437c1a9ce0b24cc6186c7" Dec 05 06:43:55 crc kubenswrapper[4865]: E1205 06:43:55.617421 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"656efa75b042675bedf0f4e3c05c4c488108f15ed97437c1a9ce0b24cc6186c7\": container with ID starting with 656efa75b042675bedf0f4e3c05c4c488108f15ed97437c1a9ce0b24cc6186c7 not found: ID does not exist" containerID="656efa75b042675bedf0f4e3c05c4c488108f15ed97437c1a9ce0b24cc6186c7" Dec 05 06:43:55 crc kubenswrapper[4865]: I1205 06:43:55.617464 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656efa75b042675bedf0f4e3c05c4c488108f15ed97437c1a9ce0b24cc6186c7"} err="failed to get container status \"656efa75b042675bedf0f4e3c05c4c488108f15ed97437c1a9ce0b24cc6186c7\": rpc error: code = NotFound desc = could not find container \"656efa75b042675bedf0f4e3c05c4c488108f15ed97437c1a9ce0b24cc6186c7\": container with ID starting with 656efa75b042675bedf0f4e3c05c4c488108f15ed97437c1a9ce0b24cc6186c7 not found: ID does not exist" Dec 05 06:43:55 crc kubenswrapper[4865]: I1205 06:43:55.617489 4865 scope.go:117] "RemoveContainer" containerID="1491ed6f9a50801e1c263153c2708ab7f67ec15e27820c9252948e1dab7f80d6" Dec 05 06:43:55 crc kubenswrapper[4865]: E1205 06:43:55.618012 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1491ed6f9a50801e1c263153c2708ab7f67ec15e27820c9252948e1dab7f80d6\": container with ID starting with 1491ed6f9a50801e1c263153c2708ab7f67ec15e27820c9252948e1dab7f80d6 not found: ID does not exist" containerID="1491ed6f9a50801e1c263153c2708ab7f67ec15e27820c9252948e1dab7f80d6" Dec 05 06:43:55 crc kubenswrapper[4865]: I1205 06:43:55.618043 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1491ed6f9a50801e1c263153c2708ab7f67ec15e27820c9252948e1dab7f80d6"} err="failed to get container status \"1491ed6f9a50801e1c263153c2708ab7f67ec15e27820c9252948e1dab7f80d6\": rpc error: code = NotFound desc = could not find container \"1491ed6f9a50801e1c263153c2708ab7f67ec15e27820c9252948e1dab7f80d6\": container with ID starting with 1491ed6f9a50801e1c263153c2708ab7f67ec15e27820c9252948e1dab7f80d6 not found: ID does not exist" Dec 05 06:43:55 crc kubenswrapper[4865]: I1205 06:43:55.618063 4865 scope.go:117] "RemoveContainer" containerID="aa4bbbd3e7c3d0f60c892656f6fee29a06790fc6fc45fc338bbb40508128e4b4" Dec 05 06:43:55 crc kubenswrapper[4865]: E1205 06:43:55.618314 4865 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"aa4bbbd3e7c3d0f60c892656f6fee29a06790fc6fc45fc338bbb40508128e4b4\": container with ID starting with aa4bbbd3e7c3d0f60c892656f6fee29a06790fc6fc45fc338bbb40508128e4b4 not found: ID does not exist" containerID="aa4bbbd3e7c3d0f60c892656f6fee29a06790fc6fc45fc338bbb40508128e4b4" Dec 05 06:43:55 crc kubenswrapper[4865]: I1205 06:43:55.618353 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa4bbbd3e7c3d0f60c892656f6fee29a06790fc6fc45fc338bbb40508128e4b4"} err="failed to get container status \"aa4bbbd3e7c3d0f60c892656f6fee29a06790fc6fc45fc338bbb40508128e4b4\": rpc error: code = NotFound desc = could not find container \"aa4bbbd3e7c3d0f60c892656f6fee29a06790fc6fc45fc338bbb40508128e4b4\": container with ID starting with aa4bbbd3e7c3d0f60c892656f6fee29a06790fc6fc45fc338bbb40508128e4b4 not found: ID does not exist" Dec 05 06:43:57 crc kubenswrapper[4865]: I1205 06:43:57.028696 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f217a20-d75d-488d-a70d-a24d7607832b" path="/var/lib/kubelet/pods/0f217a20-d75d-488d-a70d-a24d7607832b/volumes" Dec 05 06:44:01 crc kubenswrapper[4865]: I1205 06:44:01.576287 4865 generic.go:334] "Generic (PLEG): container finished" podID="4fe98c92-1aa9-444a-88d9-1280d7865f92" containerID="050711682ab536bed737142cca106eb6b6c6b8e25f3d80af5a3175823f7595f3" exitCode=0 Dec 05 06:44:01 crc kubenswrapper[4865]: I1205 06:44:01.576399 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" event={"ID":"4fe98c92-1aa9-444a-88d9-1280d7865f92","Type":"ContainerDied","Data":"050711682ab536bed737142cca106eb6b6c6b8e25f3d80af5a3175823f7595f3"} Dec 05 06:44:02 crc kubenswrapper[4865]: I1205 06:44:02.999666 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.114471 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ssh-key\") pod \"4fe98c92-1aa9-444a-88d9-1280d7865f92\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.114890 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-telemetry-combined-ca-bundle\") pod \"4fe98c92-1aa9-444a-88d9-1280d7865f92\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.115013 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ceilometer-compute-config-data-2\") pod \"4fe98c92-1aa9-444a-88d9-1280d7865f92\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.115341 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ceilometer-compute-config-data-0\") pod \"4fe98c92-1aa9-444a-88d9-1280d7865f92\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.115405 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pvpd\" (UniqueName: \"kubernetes.io/projected/4fe98c92-1aa9-444a-88d9-1280d7865f92-kube-api-access-7pvpd\") pod \"4fe98c92-1aa9-444a-88d9-1280d7865f92\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.115454 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ceilometer-compute-config-data-1\") pod \"4fe98c92-1aa9-444a-88d9-1280d7865f92\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.115489 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-inventory\") pod \"4fe98c92-1aa9-444a-88d9-1280d7865f92\" (UID: \"4fe98c92-1aa9-444a-88d9-1280d7865f92\") " Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.120061 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "4fe98c92-1aa9-444a-88d9-1280d7865f92" (UID: "4fe98c92-1aa9-444a-88d9-1280d7865f92"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.136978 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fe98c92-1aa9-444a-88d9-1280d7865f92-kube-api-access-7pvpd" (OuterVolumeSpecName: "kube-api-access-7pvpd") pod "4fe98c92-1aa9-444a-88d9-1280d7865f92" (UID: "4fe98c92-1aa9-444a-88d9-1280d7865f92"). 
InnerVolumeSpecName "kube-api-access-7pvpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.145667 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "4fe98c92-1aa9-444a-88d9-1280d7865f92" (UID: "4fe98c92-1aa9-444a-88d9-1280d7865f92"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.153531 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "4fe98c92-1aa9-444a-88d9-1280d7865f92" (UID: "4fe98c92-1aa9-444a-88d9-1280d7865f92"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.159923 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "4fe98c92-1aa9-444a-88d9-1280d7865f92" (UID: "4fe98c92-1aa9-444a-88d9-1280d7865f92"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.178009 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4fe98c92-1aa9-444a-88d9-1280d7865f92" (UID: "4fe98c92-1aa9-444a-88d9-1280d7865f92"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.178996 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-inventory" (OuterVolumeSpecName: "inventory") pod "4fe98c92-1aa9-444a-88d9-1280d7865f92" (UID: "4fe98c92-1aa9-444a-88d9-1280d7865f92"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.218363 4865 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.218419 4865 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.218434 4865 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.218451 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pvpd\" (UniqueName: \"kubernetes.io/projected/4fe98c92-1aa9-444a-88d9-1280d7865f92-kube-api-access-7pvpd\") on node \"crc\" DevicePath \"\"" Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.218485 4865 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.218497 4865 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.218509 4865 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fe98c92-1aa9-444a-88d9-1280d7865f92-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.597879 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" event={"ID":"4fe98c92-1aa9-444a-88d9-1280d7865f92","Type":"ContainerDied","Data":"409121c1def8145e26b154051c7789c93711f8ae1daf058e9591afff5ebb7c14"} Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.597925 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="409121c1def8145e26b154051c7789c93711f8ae1daf058e9591afff5ebb7c14" Dec 05 06:44:03 crc kubenswrapper[4865]: I1205 06:44:03.597991 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7" Dec 05 06:44:04 crc kubenswrapper[4865]: I1205 06:44:04.006713 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:44:04 crc kubenswrapper[4865]: E1205 06:44:04.007305 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:44:18 crc kubenswrapper[4865]: I1205 06:44:18.005964 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:44:18 crc kubenswrapper[4865]: E1205 06:44:18.006721 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:44:30 crc kubenswrapper[4865]: I1205 06:44:30.006960 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:44:30 crc kubenswrapper[4865]: E1205 06:44:30.007681 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:44:42 crc kubenswrapper[4865]: I1205 06:44:42.007373 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:44:42 crc kubenswrapper[4865]: E1205 06:44:42.008260 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:44:57 crc kubenswrapper[4865]: I1205 06:44:57.007528 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:44:57 crc kubenswrapper[4865]: E1205 06:44:57.008715 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.424698 4865 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/tempest-tests-tempest"] Dec 05 06:44:59 crc kubenswrapper[4865]: E1205 06:44:59.426279 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f217a20-d75d-488d-a70d-a24d7607832b" containerName="extract-utilities" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.426314 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f217a20-d75d-488d-a70d-a24d7607832b" containerName="extract-utilities" Dec 05 06:44:59 crc kubenswrapper[4865]: E1205 06:44:59.426339 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01fde809-1d90-4a6b-a6b4-1cc442235923" containerName="extract-content" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.426383 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fde809-1d90-4a6b-a6b4-1cc442235923" containerName="extract-content" Dec 05 06:44:59 crc kubenswrapper[4865]: E1205 06:44:59.426424 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f217a20-d75d-488d-a70d-a24d7607832b" containerName="extract-content" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.426447 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f217a20-d75d-488d-a70d-a24d7607832b" containerName="extract-content" Dec 05 06:44:59 crc kubenswrapper[4865]: E1205 06:44:59.427049 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f217a20-d75d-488d-a70d-a24d7607832b" containerName="registry-server" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.427074 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f217a20-d75d-488d-a70d-a24d7607832b" containerName="registry-server" Dec 05 06:44:59 crc kubenswrapper[4865]: E1205 06:44:59.427116 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01fde809-1d90-4a6b-a6b4-1cc442235923" containerName="extract-utilities" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.427134 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fde809-1d90-4a6b-a6b4-1cc442235923" containerName="extract-utilities" Dec 05 06:44:59 crc kubenswrapper[4865]: E1205 06:44:59.427168 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fe98c92-1aa9-444a-88d9-1280d7865f92" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.427186 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fe98c92-1aa9-444a-88d9-1280d7865f92" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 05 06:44:59 crc kubenswrapper[4865]: E1205 06:44:59.427218 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01fde809-1d90-4a6b-a6b4-1cc442235923" containerName="registry-server" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.427234 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fde809-1d90-4a6b-a6b4-1cc442235923" containerName="registry-server" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.427695 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fe98c92-1aa9-444a-88d9-1280d7865f92" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.427759 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="01fde809-1d90-4a6b-a6b4-1cc442235923" containerName="registry-server" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.427799 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f217a20-d75d-488d-a70d-a24d7607832b" containerName="registry-server" Dec 05 
06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.429152 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.432626 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-c9k8f" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.432865 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.434069 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.434476 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.441752 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.542058 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6tgl\" (UniqueName: \"kubernetes.io/projected/564b1ff3-5b9c-4058-94b2-a488e26b27dc-kube-api-access-c6tgl\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.542140 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/564b1ff3-5b9c-4058-94b2-a488e26b27dc-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.542169 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/564b1ff3-5b9c-4058-94b2-a488e26b27dc-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.542330 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/564b1ff3-5b9c-4058-94b2-a488e26b27dc-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.542364 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/564b1ff3-5b9c-4058-94b2-a488e26b27dc-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.542401 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/564b1ff3-5b9c-4058-94b2-a488e26b27dc-config-data\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.542460 4865 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.542489 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/564b1ff3-5b9c-4058-94b2-a488e26b27dc-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.542562 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/564b1ff3-5b9c-4058-94b2-a488e26b27dc-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.645064 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/564b1ff3-5b9c-4058-94b2-a488e26b27dc-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.645199 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6tgl\" (UniqueName: \"kubernetes.io/projected/564b1ff3-5b9c-4058-94b2-a488e26b27dc-kube-api-access-c6tgl\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.645275 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/564b1ff3-5b9c-4058-94b2-a488e26b27dc-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.645311 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/564b1ff3-5b9c-4058-94b2-a488e26b27dc-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.645549 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/564b1ff3-5b9c-4058-94b2-a488e26b27dc-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.645583 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/564b1ff3-5b9c-4058-94b2-a488e26b27dc-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.646429 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" 
(UniqueName: \"kubernetes.io/empty-dir/564b1ff3-5b9c-4058-94b2-a488e26b27dc-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.646687 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/564b1ff3-5b9c-4058-94b2-a488e26b27dc-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.647577 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/564b1ff3-5b9c-4058-94b2-a488e26b27dc-config-data\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.652514 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/564b1ff3-5b9c-4058-94b2-a488e26b27dc-config-data\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.652678 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.652789 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/564b1ff3-5b9c-4058-94b2-a488e26b27dc-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.653495 4865 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.653638 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/564b1ff3-5b9c-4058-94b2-a488e26b27dc-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.654151 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/564b1ff3-5b9c-4058-94b2-a488e26b27dc-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.655433 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/564b1ff3-5b9c-4058-94b2-a488e26b27dc-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " 
pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.656566 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/564b1ff3-5b9c-4058-94b2-a488e26b27dc-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.673848 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6tgl\" (UniqueName: \"kubernetes.io/projected/564b1ff3-5b9c-4058-94b2-a488e26b27dc-kube-api-access-c6tgl\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.689407 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " pod="openstack/tempest-tests-tempest" Dec 05 06:44:59 crc kubenswrapper[4865]: I1205 06:44:59.776484 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 06:45:00 crc kubenswrapper[4865]: I1205 06:45:00.138458 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415285-9m2sw"] Dec 05 06:45:00 crc kubenswrapper[4865]: I1205 06:45:00.140242 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415285-9m2sw" Dec 05 06:45:00 crc kubenswrapper[4865]: I1205 06:45:00.143441 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 06:45:00 crc kubenswrapper[4865]: I1205 06:45:00.144170 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 06:45:00 crc kubenswrapper[4865]: I1205 06:45:00.151257 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415285-9m2sw"] Dec 05 06:45:00 crc kubenswrapper[4865]: I1205 06:45:00.254426 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 05 06:45:00 crc kubenswrapper[4865]: I1205 06:45:00.292047 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x85fw\" (UniqueName: \"kubernetes.io/projected/50115a27-3f10-433d-b8db-311ba3fc75ba-kube-api-access-x85fw\") pod \"collect-profiles-29415285-9m2sw\" (UID: \"50115a27-3f10-433d-b8db-311ba3fc75ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415285-9m2sw" Dec 05 06:45:00 crc kubenswrapper[4865]: I1205 06:45:00.292120 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50115a27-3f10-433d-b8db-311ba3fc75ba-config-volume\") pod \"collect-profiles-29415285-9m2sw\" (UID: \"50115a27-3f10-433d-b8db-311ba3fc75ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415285-9m2sw" Dec 05 06:45:00 crc kubenswrapper[4865]: I1205 06:45:00.292164 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/50115a27-3f10-433d-b8db-311ba3fc75ba-secret-volume\") pod \"collect-profiles-29415285-9m2sw\" (UID: \"50115a27-3f10-433d-b8db-311ba3fc75ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415285-9m2sw" Dec 05 06:45:00 crc kubenswrapper[4865]: I1205 06:45:00.393731 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x85fw\" (UniqueName: \"kubernetes.io/projected/50115a27-3f10-433d-b8db-311ba3fc75ba-kube-api-access-x85fw\") pod \"collect-profiles-29415285-9m2sw\" (UID: \"50115a27-3f10-433d-b8db-311ba3fc75ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415285-9m2sw" Dec 05 06:45:00 crc kubenswrapper[4865]: I1205 06:45:00.393839 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50115a27-3f10-433d-b8db-311ba3fc75ba-config-volume\") pod \"collect-profiles-29415285-9m2sw\" (UID: \"50115a27-3f10-433d-b8db-311ba3fc75ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415285-9m2sw" Dec 05 06:45:00 crc kubenswrapper[4865]: I1205 06:45:00.393905 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50115a27-3f10-433d-b8db-311ba3fc75ba-secret-volume\") pod \"collect-profiles-29415285-9m2sw\" (UID: \"50115a27-3f10-433d-b8db-311ba3fc75ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415285-9m2sw" Dec 05 06:45:00 crc kubenswrapper[4865]: I1205 06:45:00.394998 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50115a27-3f10-433d-b8db-311ba3fc75ba-config-volume\") pod \"collect-profiles-29415285-9m2sw\" (UID: \"50115a27-3f10-433d-b8db-311ba3fc75ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415285-9m2sw" Dec 05 06:45:00 crc kubenswrapper[4865]: I1205 06:45:00.401020 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50115a27-3f10-433d-b8db-311ba3fc75ba-secret-volume\") pod \"collect-profiles-29415285-9m2sw\" (UID: \"50115a27-3f10-433d-b8db-311ba3fc75ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415285-9m2sw" Dec 05 06:45:00 crc kubenswrapper[4865]: I1205 06:45:00.411705 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x85fw\" (UniqueName: \"kubernetes.io/projected/50115a27-3f10-433d-b8db-311ba3fc75ba-kube-api-access-x85fw\") pod \"collect-profiles-29415285-9m2sw\" (UID: \"50115a27-3f10-433d-b8db-311ba3fc75ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415285-9m2sw" Dec 05 06:45:00 crc kubenswrapper[4865]: I1205 06:45:00.516871 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415285-9m2sw" Dec 05 06:45:00 crc kubenswrapper[4865]: I1205 06:45:00.991081 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415285-9m2sw"] Dec 05 06:45:01 crc kubenswrapper[4865]: I1205 06:45:01.217438 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"564b1ff3-5b9c-4058-94b2-a488e26b27dc","Type":"ContainerStarted","Data":"49e1a0a6ece17732af381659bcd98fa5ab102f78c1df7f8110c0fd5df50c8636"} Dec 05 06:45:01 crc kubenswrapper[4865]: I1205 06:45:01.219589 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415285-9m2sw" event={"ID":"50115a27-3f10-433d-b8db-311ba3fc75ba","Type":"ContainerStarted","Data":"12236b5960c1b8c96c4bd0eb8e5bea26453024e34a518b3eea3924a01808c51c"} Dec 05 06:45:02 crc kubenswrapper[4865]: I1205 06:45:02.236472 4865 generic.go:334] "Generic (PLEG): container finished" podID="50115a27-3f10-433d-b8db-311ba3fc75ba" containerID="6b526418c58d2c09428230635de644556d6d992fd40d58cf422a8893cc19a913" exitCode=0 Dec 05 06:45:02 crc kubenswrapper[4865]: I1205 06:45:02.236542 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415285-9m2sw" event={"ID":"50115a27-3f10-433d-b8db-311ba3fc75ba","Type":"ContainerDied","Data":"6b526418c58d2c09428230635de644556d6d992fd40d58cf422a8893cc19a913"} Dec 05 06:45:09 crc kubenswrapper[4865]: I1205 06:45:09.168712 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415285-9m2sw" Dec 05 06:45:09 crc kubenswrapper[4865]: I1205 06:45:09.300192 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415285-9m2sw" event={"ID":"50115a27-3f10-433d-b8db-311ba3fc75ba","Type":"ContainerDied","Data":"12236b5960c1b8c96c4bd0eb8e5bea26453024e34a518b3eea3924a01808c51c"} Dec 05 06:45:09 crc kubenswrapper[4865]: I1205 06:45:09.300244 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415285-9m2sw" Dec 05 06:45:09 crc kubenswrapper[4865]: I1205 06:45:09.300258 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12236b5960c1b8c96c4bd0eb8e5bea26453024e34a518b3eea3924a01808c51c" Dec 05 06:45:09 crc kubenswrapper[4865]: I1205 06:45:09.363168 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50115a27-3f10-433d-b8db-311ba3fc75ba-config-volume\") pod \"50115a27-3f10-433d-b8db-311ba3fc75ba\" (UID: \"50115a27-3f10-433d-b8db-311ba3fc75ba\") " Dec 05 06:45:09 crc kubenswrapper[4865]: I1205 06:45:09.363767 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50115a27-3f10-433d-b8db-311ba3fc75ba-secret-volume\") pod \"50115a27-3f10-433d-b8db-311ba3fc75ba\" (UID: \"50115a27-3f10-433d-b8db-311ba3fc75ba\") " Dec 05 06:45:09 crc kubenswrapper[4865]: I1205 06:45:09.363837 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x85fw\" (UniqueName: \"kubernetes.io/projected/50115a27-3f10-433d-b8db-311ba3fc75ba-kube-api-access-x85fw\") pod \"50115a27-3f10-433d-b8db-311ba3fc75ba\" (UID: \"50115a27-3f10-433d-b8db-311ba3fc75ba\") " Dec 05 06:45:09 crc kubenswrapper[4865]: I1205 06:45:09.365343 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50115a27-3f10-433d-b8db-311ba3fc75ba-config-volume" (OuterVolumeSpecName: "config-volume") pod "50115a27-3f10-433d-b8db-311ba3fc75ba" (UID: "50115a27-3f10-433d-b8db-311ba3fc75ba"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 06:45:09 crc kubenswrapper[4865]: I1205 06:45:09.375472 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50115a27-3f10-433d-b8db-311ba3fc75ba-kube-api-access-x85fw" (OuterVolumeSpecName: "kube-api-access-x85fw") pod "50115a27-3f10-433d-b8db-311ba3fc75ba" (UID: "50115a27-3f10-433d-b8db-311ba3fc75ba"). InnerVolumeSpecName "kube-api-access-x85fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:45:09 crc kubenswrapper[4865]: I1205 06:45:09.384564 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50115a27-3f10-433d-b8db-311ba3fc75ba-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "50115a27-3f10-433d-b8db-311ba3fc75ba" (UID: "50115a27-3f10-433d-b8db-311ba3fc75ba"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 06:45:09 crc kubenswrapper[4865]: I1205 06:45:09.465794 4865 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50115a27-3f10-433d-b8db-311ba3fc75ba-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 06:45:09 crc kubenswrapper[4865]: I1205 06:45:09.465939 4865 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50115a27-3f10-433d-b8db-311ba3fc75ba-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 06:45:09 crc kubenswrapper[4865]: I1205 06:45:09.465952 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x85fw\" (UniqueName: \"kubernetes.io/projected/50115a27-3f10-433d-b8db-311ba3fc75ba-kube-api-access-x85fw\") on node \"crc\" DevicePath \"\"" Dec 05 06:45:10 crc kubenswrapper[4865]: I1205 06:45:10.006725 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:45:10 crc kubenswrapper[4865]: E1205 06:45:10.007056 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:45:10 crc kubenswrapper[4865]: I1205 06:45:10.258505 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4"] Dec 05 06:45:10 crc kubenswrapper[4865]: I1205 06:45:10.269942 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415240-ds7p4"] Dec 05 06:45:11 crc kubenswrapper[4865]: I1205 06:45:11.019948 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1384ef4-f086-4d99-92af-ed79b1e25ac8" path="/var/lib/kubelet/pods/c1384ef4-f086-4d99-92af-ed79b1e25ac8/volumes" Dec 05 06:45:23 crc kubenswrapper[4865]: I1205 06:45:23.006588 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:45:23 crc kubenswrapper[4865]: E1205 06:45:23.007333 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:45:34 crc kubenswrapper[4865]: E1205 06:45:34.094290 4865 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 05 06:45:34 crc kubenswrapper[4865]: E1205 06:45:34.095500 4865 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c6tgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(564b1ff3-5b9c-4058-94b2-a488e26b27dc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 06:45:34 crc kubenswrapper[4865]: E1205 06:45:34.096759 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="564b1ff3-5b9c-4058-94b2-a488e26b27dc" Dec 05 06:45:34 crc kubenswrapper[4865]: E1205 06:45:34.567054 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="564b1ff3-5b9c-4058-94b2-a488e26b27dc" Dec 05 06:45:35 crc kubenswrapper[4865]: I1205 06:45:35.007583 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:45:35 crc kubenswrapper[4865]: E1205 06:45:35.008122 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:45:39 crc kubenswrapper[4865]: I1205 06:45:39.395149 4865 scope.go:117] "RemoveContainer" containerID="b06dd3f9ba6d11b579453c441f4b36af739a6c6294319197f5113532322064df" Dec 05 06:45:49 crc kubenswrapper[4865]: I1205 06:45:49.093713 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 05 06:45:50 crc kubenswrapper[4865]: I1205 06:45:50.006732 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:45:50 crc kubenswrapper[4865]: E1205 06:45:50.007534 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:45:50 crc kubenswrapper[4865]: I1205 06:45:50.734711 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"564b1ff3-5b9c-4058-94b2-a488e26b27dc","Type":"ContainerStarted","Data":"3a3fcfaa61b18668b7c50b718db1a67c7783a5dceccc412ddcef5e33a56be7f1"} Dec 05 06:45:50 crc kubenswrapper[4865]: I1205 06:45:50.765135 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.934364079 podStartE2EDuration="52.765101937s" podCreationTimestamp="2025-12-05 06:44:58 +0000 UTC" firstStartedPulling="2025-12-05 06:45:00.260518526 +0000 UTC m=+3119.540529748" lastFinishedPulling="2025-12-05 06:45:49.091256384 +0000 UTC m=+3168.371267606" observedRunningTime="2025-12-05 06:45:50.76132177 +0000 UTC m=+3170.041333012" watchObservedRunningTime="2025-12-05 06:45:50.765101937 +0000 UTC m=+3170.045113159" Dec 05 06:46:04 crc kubenswrapper[4865]: I1205 06:46:04.007248 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:46:04 crc kubenswrapper[4865]: E1205 06:46:04.008318 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:46:19 crc kubenswrapper[4865]: I1205 06:46:19.006995 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:46:19 crc kubenswrapper[4865]: E1205 06:46:19.008120 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:46:33 crc kubenswrapper[4865]: I1205 06:46:33.007205 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:46:33 crc kubenswrapper[4865]: E1205 06:46:33.008477 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:46:46 crc kubenswrapper[4865]: I1205 06:46:46.006358 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:46:47 crc kubenswrapper[4865]: I1205 06:46:47.226712 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"ffb7a7d60380188b204175d8d8ea12d8da0e5ef2dd245cd192b61c5283d10792"} Dec 05 06:47:45 crc kubenswrapper[4865]: I1205 06:47:45.705616 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bd22p"] Dec 05 06:47:45 crc kubenswrapper[4865]: E1205 06:47:45.706590 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50115a27-3f10-433d-b8db-311ba3fc75ba" containerName="collect-profiles" Dec 05 06:47:45 crc kubenswrapper[4865]: I1205 06:47:45.706607 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="50115a27-3f10-433d-b8db-311ba3fc75ba" containerName="collect-profiles" Dec 05 06:47:45 crc kubenswrapper[4865]: I1205 06:47:45.706898 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="50115a27-3f10-433d-b8db-311ba3fc75ba" containerName="collect-profiles" Dec 05 06:47:45 crc kubenswrapper[4865]: I1205 06:47:45.708559 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bd22p" Dec 05 06:47:45 crc kubenswrapper[4865]: I1205 06:47:45.730069 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bd22p"] Dec 05 06:47:45 crc kubenswrapper[4865]: I1205 06:47:45.897635 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed20a83-0124-4c33-8612-da121cb1f5bb-utilities\") pod \"certified-operators-bd22p\" (UID: \"3ed20a83-0124-4c33-8612-da121cb1f5bb\") " pod="openshift-marketplace/certified-operators-bd22p" Dec 05 06:47:45 crc kubenswrapper[4865]: I1205 06:47:45.897899 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l98ww\" (UniqueName: \"kubernetes.io/projected/3ed20a83-0124-4c33-8612-da121cb1f5bb-kube-api-access-l98ww\") pod \"certified-operators-bd22p\" (UID: \"3ed20a83-0124-4c33-8612-da121cb1f5bb\") " pod="openshift-marketplace/certified-operators-bd22p" Dec 05 06:47:45 crc kubenswrapper[4865]: I1205 06:47:45.898231 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed20a83-0124-4c33-8612-da121cb1f5bb-catalog-content\") pod \"certified-operators-bd22p\" (UID: \"3ed20a83-0124-4c33-8612-da121cb1f5bb\") " pod="openshift-marketplace/certified-operators-bd22p" Dec 05 06:47:45 crc kubenswrapper[4865]: I1205 06:47:45.999658 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l98ww\" (UniqueName: \"kubernetes.io/projected/3ed20a83-0124-4c33-8612-da121cb1f5bb-kube-api-access-l98ww\") pod \"certified-operators-bd22p\" (UID: \"3ed20a83-0124-4c33-8612-da121cb1f5bb\") " pod="openshift-marketplace/certified-operators-bd22p" Dec 05 06:47:45 crc kubenswrapper[4865]: I1205 06:47:45.999806 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed20a83-0124-4c33-8612-da121cb1f5bb-catalog-content\") pod \"certified-operators-bd22p\" (UID: \"3ed20a83-0124-4c33-8612-da121cb1f5bb\") " pod="openshift-marketplace/certified-operators-bd22p" Dec 05 06:47:45 crc kubenswrapper[4865]: I1205 06:47:45.999896 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed20a83-0124-4c33-8612-da121cb1f5bb-utilities\") pod \"certified-operators-bd22p\" (UID: \"3ed20a83-0124-4c33-8612-da121cb1f5bb\") " pod="openshift-marketplace/certified-operators-bd22p" Dec 05 06:47:46 crc kubenswrapper[4865]: I1205 06:47:46.001668 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed20a83-0124-4c33-8612-da121cb1f5bb-catalog-content\") pod \"certified-operators-bd22p\" (UID: \"3ed20a83-0124-4c33-8612-da121cb1f5bb\") " pod="openshift-marketplace/certified-operators-bd22p" Dec 05 06:47:46 crc kubenswrapper[4865]: I1205 06:47:46.001702 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed20a83-0124-4c33-8612-da121cb1f5bb-utilities\") pod \"certified-operators-bd22p\" (UID: \"3ed20a83-0124-4c33-8612-da121cb1f5bb\") " pod="openshift-marketplace/certified-operators-bd22p" Dec 05 06:47:46 crc kubenswrapper[4865]: I1205 06:47:46.027592 4865 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l98ww\" (UniqueName: \"kubernetes.io/projected/3ed20a83-0124-4c33-8612-da121cb1f5bb-kube-api-access-l98ww\") pod \"certified-operators-bd22p\" (UID: \"3ed20a83-0124-4c33-8612-da121cb1f5bb\") " pod="openshift-marketplace/certified-operators-bd22p" Dec 05 06:47:46 crc kubenswrapper[4865]: I1205 06:47:46.326755 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bd22p" Dec 05 06:47:47 crc kubenswrapper[4865]: I1205 06:47:47.191720 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bd22p"] Dec 05 06:47:47 crc kubenswrapper[4865]: I1205 06:47:47.842794 4865 generic.go:334] "Generic (PLEG): container finished" podID="3ed20a83-0124-4c33-8612-da121cb1f5bb" containerID="a11d9d1fe3f451fabf551b7961a26fca5d6a18b3b8b4ebc1ea24f111b0f768d4" exitCode=0 Dec 05 06:47:47 crc kubenswrapper[4865]: I1205 06:47:47.843314 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd22p" event={"ID":"3ed20a83-0124-4c33-8612-da121cb1f5bb","Type":"ContainerDied","Data":"a11d9d1fe3f451fabf551b7961a26fca5d6a18b3b8b4ebc1ea24f111b0f768d4"} Dec 05 06:47:47 crc kubenswrapper[4865]: I1205 06:47:47.843364 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd22p" event={"ID":"3ed20a83-0124-4c33-8612-da121cb1f5bb","Type":"ContainerStarted","Data":"7fe6eb9e5373490015d8a1059bd8403b248622750ca93bafba0a0e49b7fc6074"} Dec 05 06:47:48 crc kubenswrapper[4865]: I1205 06:47:48.854389 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd22p" event={"ID":"3ed20a83-0124-4c33-8612-da121cb1f5bb","Type":"ContainerStarted","Data":"38b7e79bcac2a70c61097baf4a006f9c8be0876a259e14de79cdda7b685ddf59"} Dec 05 06:47:50 crc kubenswrapper[4865]: I1205 06:47:50.880571 4865 generic.go:334] "Generic (PLEG): container finished" podID="3ed20a83-0124-4c33-8612-da121cb1f5bb" containerID="38b7e79bcac2a70c61097baf4a006f9c8be0876a259e14de79cdda7b685ddf59" exitCode=0 Dec 05 06:47:50 crc kubenswrapper[4865]: I1205 06:47:50.881283 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd22p" event={"ID":"3ed20a83-0124-4c33-8612-da121cb1f5bb","Type":"ContainerDied","Data":"38b7e79bcac2a70c61097baf4a006f9c8be0876a259e14de79cdda7b685ddf59"} Dec 05 06:47:51 crc kubenswrapper[4865]: I1205 06:47:51.896960 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd22p" event={"ID":"3ed20a83-0124-4c33-8612-da121cb1f5bb","Type":"ContainerStarted","Data":"46453b5e4264e3d5c4e31515cd8167c628c77083265aaf0bb7eb7e38ac178b16"} Dec 05 06:47:51 crc kubenswrapper[4865]: I1205 06:47:51.923456 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bd22p" podStartSLOduration=3.425804084 podStartE2EDuration="6.923424474s" podCreationTimestamp="2025-12-05 06:47:45 +0000 UTC" firstStartedPulling="2025-12-05 06:47:47.845376247 +0000 UTC m=+3287.125387469" lastFinishedPulling="2025-12-05 06:47:51.342996637 +0000 UTC m=+3290.623007859" observedRunningTime="2025-12-05 06:47:51.917564478 +0000 UTC m=+3291.197575700" watchObservedRunningTime="2025-12-05 06:47:51.923424474 +0000 UTC m=+3291.203435706" Dec 05 06:47:56 crc kubenswrapper[4865]: I1205 06:47:56.327621 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-bd22p" Dec 05 06:47:56 crc kubenswrapper[4865]: I1205 06:47:56.327986 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bd22p" Dec 05 06:47:56 crc kubenswrapper[4865]: I1205 06:47:56.400176 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bd22p" Dec 05 06:47:57 crc kubenswrapper[4865]: I1205 06:47:57.112279 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bd22p" Dec 05 06:47:57 crc kubenswrapper[4865]: I1205 06:47:57.177419 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bd22p"] Dec 05 06:47:58 crc kubenswrapper[4865]: I1205 06:47:58.953063 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bd22p" podUID="3ed20a83-0124-4c33-8612-da121cb1f5bb" containerName="registry-server" containerID="cri-o://46453b5e4264e3d5c4e31515cd8167c628c77083265aaf0bb7eb7e38ac178b16" gracePeriod=2 Dec 05 06:47:59 crc kubenswrapper[4865]: I1205 06:47:59.716405 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bd22p" Dec 05 06:47:59 crc kubenswrapper[4865]: I1205 06:47:59.905855 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed20a83-0124-4c33-8612-da121cb1f5bb-utilities\") pod \"3ed20a83-0124-4c33-8612-da121cb1f5bb\" (UID: \"3ed20a83-0124-4c33-8612-da121cb1f5bb\") " Dec 05 06:47:59 crc kubenswrapper[4865]: I1205 06:47:59.905991 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed20a83-0124-4c33-8612-da121cb1f5bb-catalog-content\") pod \"3ed20a83-0124-4c33-8612-da121cb1f5bb\" (UID: \"3ed20a83-0124-4c33-8612-da121cb1f5bb\") " Dec 05 06:47:59 crc kubenswrapper[4865]: I1205 06:47:59.906074 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l98ww\" (UniqueName: \"kubernetes.io/projected/3ed20a83-0124-4c33-8612-da121cb1f5bb-kube-api-access-l98ww\") pod \"3ed20a83-0124-4c33-8612-da121cb1f5bb\" (UID: \"3ed20a83-0124-4c33-8612-da121cb1f5bb\") " Dec 05 06:47:59 crc kubenswrapper[4865]: I1205 06:47:59.907687 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ed20a83-0124-4c33-8612-da121cb1f5bb-utilities" (OuterVolumeSpecName: "utilities") pod "3ed20a83-0124-4c33-8612-da121cb1f5bb" (UID: "3ed20a83-0124-4c33-8612-da121cb1f5bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:47:59 crc kubenswrapper[4865]: I1205 06:47:59.924258 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed20a83-0124-4c33-8612-da121cb1f5bb-kube-api-access-l98ww" (OuterVolumeSpecName: "kube-api-access-l98ww") pod "3ed20a83-0124-4c33-8612-da121cb1f5bb" (UID: "3ed20a83-0124-4c33-8612-da121cb1f5bb"). InnerVolumeSpecName "kube-api-access-l98ww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:47:59 crc kubenswrapper[4865]: I1205 06:47:59.977505 4865 generic.go:334] "Generic (PLEG): container finished" podID="3ed20a83-0124-4c33-8612-da121cb1f5bb" containerID="46453b5e4264e3d5c4e31515cd8167c628c77083265aaf0bb7eb7e38ac178b16" exitCode=0 Dec 05 06:47:59 crc kubenswrapper[4865]: I1205 06:47:59.977564 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd22p" event={"ID":"3ed20a83-0124-4c33-8612-da121cb1f5bb","Type":"ContainerDied","Data":"46453b5e4264e3d5c4e31515cd8167c628c77083265aaf0bb7eb7e38ac178b16"} Dec 05 06:47:59 crc kubenswrapper[4865]: I1205 06:47:59.977591 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd22p" event={"ID":"3ed20a83-0124-4c33-8612-da121cb1f5bb","Type":"ContainerDied","Data":"7fe6eb9e5373490015d8a1059bd8403b248622750ca93bafba0a0e49b7fc6074"} Dec 05 06:47:59 crc kubenswrapper[4865]: I1205 06:47:59.977612 4865 scope.go:117] "RemoveContainer" containerID="46453b5e4264e3d5c4e31515cd8167c628c77083265aaf0bb7eb7e38ac178b16" Dec 05 06:47:59 crc kubenswrapper[4865]: I1205 06:47:59.977777 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bd22p" Dec 05 06:47:59 crc kubenswrapper[4865]: I1205 06:47:59.992190 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ed20a83-0124-4c33-8612-da121cb1f5bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ed20a83-0124-4c33-8612-da121cb1f5bb" (UID: "3ed20a83-0124-4c33-8612-da121cb1f5bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:48:00 crc kubenswrapper[4865]: I1205 06:48:00.005043 4865 scope.go:117] "RemoveContainer" containerID="38b7e79bcac2a70c61097baf4a006f9c8be0876a259e14de79cdda7b685ddf59" Dec 05 06:48:00 crc kubenswrapper[4865]: I1205 06:48:00.011168 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed20a83-0124-4c33-8612-da121cb1f5bb-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:48:00 crc kubenswrapper[4865]: I1205 06:48:00.011201 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed20a83-0124-4c33-8612-da121cb1f5bb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:48:00 crc kubenswrapper[4865]: I1205 06:48:00.011214 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l98ww\" (UniqueName: \"kubernetes.io/projected/3ed20a83-0124-4c33-8612-da121cb1f5bb-kube-api-access-l98ww\") on node \"crc\" DevicePath \"\"" Dec 05 06:48:00 crc kubenswrapper[4865]: I1205 06:48:00.039989 4865 scope.go:117] "RemoveContainer" containerID="a11d9d1fe3f451fabf551b7961a26fca5d6a18b3b8b4ebc1ea24f111b0f768d4" Dec 05 06:48:00 crc kubenswrapper[4865]: I1205 06:48:00.076528 4865 scope.go:117] "RemoveContainer" containerID="46453b5e4264e3d5c4e31515cd8167c628c77083265aaf0bb7eb7e38ac178b16" Dec 05 06:48:00 crc kubenswrapper[4865]: E1205 06:48:00.078248 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46453b5e4264e3d5c4e31515cd8167c628c77083265aaf0bb7eb7e38ac178b16\": container with ID starting with 46453b5e4264e3d5c4e31515cd8167c628c77083265aaf0bb7eb7e38ac178b16 not found: ID does not exist" 
containerID="46453b5e4264e3d5c4e31515cd8167c628c77083265aaf0bb7eb7e38ac178b16" Dec 05 06:48:00 crc kubenswrapper[4865]: I1205 06:48:00.078304 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46453b5e4264e3d5c4e31515cd8167c628c77083265aaf0bb7eb7e38ac178b16"} err="failed to get container status \"46453b5e4264e3d5c4e31515cd8167c628c77083265aaf0bb7eb7e38ac178b16\": rpc error: code = NotFound desc = could not find container \"46453b5e4264e3d5c4e31515cd8167c628c77083265aaf0bb7eb7e38ac178b16\": container with ID starting with 46453b5e4264e3d5c4e31515cd8167c628c77083265aaf0bb7eb7e38ac178b16 not found: ID does not exist" Dec 05 06:48:00 crc kubenswrapper[4865]: I1205 06:48:00.078344 4865 scope.go:117] "RemoveContainer" containerID="38b7e79bcac2a70c61097baf4a006f9c8be0876a259e14de79cdda7b685ddf59" Dec 05 06:48:00 crc kubenswrapper[4865]: E1205 06:48:00.088513 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38b7e79bcac2a70c61097baf4a006f9c8be0876a259e14de79cdda7b685ddf59\": container with ID starting with 38b7e79bcac2a70c61097baf4a006f9c8be0876a259e14de79cdda7b685ddf59 not found: ID does not exist" containerID="38b7e79bcac2a70c61097baf4a006f9c8be0876a259e14de79cdda7b685ddf59" Dec 05 06:48:00 crc kubenswrapper[4865]: I1205 06:48:00.088567 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38b7e79bcac2a70c61097baf4a006f9c8be0876a259e14de79cdda7b685ddf59"} err="failed to get container status \"38b7e79bcac2a70c61097baf4a006f9c8be0876a259e14de79cdda7b685ddf59\": rpc error: code = NotFound desc = could not find container \"38b7e79bcac2a70c61097baf4a006f9c8be0876a259e14de79cdda7b685ddf59\": container with ID starting with 38b7e79bcac2a70c61097baf4a006f9c8be0876a259e14de79cdda7b685ddf59 not found: ID does not exist" Dec 05 06:48:00 crc kubenswrapper[4865]: I1205 06:48:00.088603 4865 scope.go:117] "RemoveContainer" containerID="a11d9d1fe3f451fabf551b7961a26fca5d6a18b3b8b4ebc1ea24f111b0f768d4" Dec 05 06:48:00 crc kubenswrapper[4865]: E1205 06:48:00.089399 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a11d9d1fe3f451fabf551b7961a26fca5d6a18b3b8b4ebc1ea24f111b0f768d4\": container with ID starting with a11d9d1fe3f451fabf551b7961a26fca5d6a18b3b8b4ebc1ea24f111b0f768d4 not found: ID does not exist" containerID="a11d9d1fe3f451fabf551b7961a26fca5d6a18b3b8b4ebc1ea24f111b0f768d4" Dec 05 06:48:00 crc kubenswrapper[4865]: I1205 06:48:00.089469 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a11d9d1fe3f451fabf551b7961a26fca5d6a18b3b8b4ebc1ea24f111b0f768d4"} err="failed to get container status \"a11d9d1fe3f451fabf551b7961a26fca5d6a18b3b8b4ebc1ea24f111b0f768d4\": rpc error: code = NotFound desc = could not find container \"a11d9d1fe3f451fabf551b7961a26fca5d6a18b3b8b4ebc1ea24f111b0f768d4\": container with ID starting with a11d9d1fe3f451fabf551b7961a26fca5d6a18b3b8b4ebc1ea24f111b0f768d4 not found: ID does not exist" Dec 05 06:48:00 crc kubenswrapper[4865]: I1205 06:48:00.323795 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bd22p"] Dec 05 06:48:00 crc kubenswrapper[4865]: I1205 06:48:00.331671 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bd22p"] Dec 05 06:48:01 crc kubenswrapper[4865]: I1205 06:48:01.039161 
4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ed20a83-0124-4c33-8612-da121cb1f5bb" path="/var/lib/kubelet/pods/3ed20a83-0124-4c33-8612-da121cb1f5bb/volumes" Dec 05 06:49:11 crc kubenswrapper[4865]: I1205 06:49:11.049387 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:49:11 crc kubenswrapper[4865]: I1205 06:49:11.050185 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:49:15 crc kubenswrapper[4865]: I1205 06:49:15.910838 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fcnj2"] Dec 05 06:49:15 crc kubenswrapper[4865]: E1205 06:49:15.911779 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed20a83-0124-4c33-8612-da121cb1f5bb" containerName="registry-server" Dec 05 06:49:15 crc kubenswrapper[4865]: I1205 06:49:15.911794 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed20a83-0124-4c33-8612-da121cb1f5bb" containerName="registry-server" Dec 05 06:49:15 crc kubenswrapper[4865]: E1205 06:49:15.911818 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed20a83-0124-4c33-8612-da121cb1f5bb" containerName="extract-content" Dec 05 06:49:15 crc kubenswrapper[4865]: I1205 06:49:15.911922 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed20a83-0124-4c33-8612-da121cb1f5bb" containerName="extract-content" Dec 05 06:49:15 crc kubenswrapper[4865]: E1205 06:49:15.911959 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed20a83-0124-4c33-8612-da121cb1f5bb" containerName="extract-utilities" Dec 05 06:49:15 crc kubenswrapper[4865]: I1205 06:49:15.911965 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed20a83-0124-4c33-8612-da121cb1f5bb" containerName="extract-utilities" Dec 05 06:49:15 crc kubenswrapper[4865]: I1205 06:49:15.912136 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed20a83-0124-4c33-8612-da121cb1f5bb" containerName="registry-server" Dec 05 06:49:15 crc kubenswrapper[4865]: I1205 06:49:15.913552 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fcnj2" Dec 05 06:49:15 crc kubenswrapper[4865]: I1205 06:49:15.937051 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fcnj2"] Dec 05 06:49:15 crc kubenswrapper[4865]: I1205 06:49:15.973086 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdfmn\" (UniqueName: \"kubernetes.io/projected/e90e39af-b3f0-4aa7-bf60-0d2cdfc05082-kube-api-access-gdfmn\") pod \"redhat-operators-fcnj2\" (UID: \"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082\") " pod="openshift-marketplace/redhat-operators-fcnj2" Dec 05 06:49:15 crc kubenswrapper[4865]: I1205 06:49:15.973204 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e90e39af-b3f0-4aa7-bf60-0d2cdfc05082-catalog-content\") pod \"redhat-operators-fcnj2\" (UID: \"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082\") " pod="openshift-marketplace/redhat-operators-fcnj2" Dec 05 06:49:15 crc kubenswrapper[4865]: I1205 06:49:15.973257 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e90e39af-b3f0-4aa7-bf60-0d2cdfc05082-utilities\") pod \"redhat-operators-fcnj2\" (UID: \"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082\") " pod="openshift-marketplace/redhat-operators-fcnj2" Dec 05 06:49:16 crc kubenswrapper[4865]: I1205 06:49:16.073882 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e90e39af-b3f0-4aa7-bf60-0d2cdfc05082-catalog-content\") pod \"redhat-operators-fcnj2\" (UID: \"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082\") " pod="openshift-marketplace/redhat-operators-fcnj2" Dec 05 06:49:16 crc kubenswrapper[4865]: I1205 06:49:16.073965 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e90e39af-b3f0-4aa7-bf60-0d2cdfc05082-utilities\") pod \"redhat-operators-fcnj2\" (UID: \"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082\") " pod="openshift-marketplace/redhat-operators-fcnj2" Dec 05 06:49:16 crc kubenswrapper[4865]: I1205 06:49:16.074032 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdfmn\" (UniqueName: \"kubernetes.io/projected/e90e39af-b3f0-4aa7-bf60-0d2cdfc05082-kube-api-access-gdfmn\") pod \"redhat-operators-fcnj2\" (UID: \"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082\") " pod="openshift-marketplace/redhat-operators-fcnj2" Dec 05 06:49:16 crc kubenswrapper[4865]: I1205 06:49:16.074451 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e90e39af-b3f0-4aa7-bf60-0d2cdfc05082-catalog-content\") pod \"redhat-operators-fcnj2\" (UID: \"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082\") " pod="openshift-marketplace/redhat-operators-fcnj2" Dec 05 06:49:16 crc kubenswrapper[4865]: I1205 06:49:16.074692 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e90e39af-b3f0-4aa7-bf60-0d2cdfc05082-utilities\") pod \"redhat-operators-fcnj2\" (UID: \"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082\") " pod="openshift-marketplace/redhat-operators-fcnj2" Dec 05 06:49:16 crc kubenswrapper[4865]: I1205 06:49:16.105295 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gdfmn\" (UniqueName: \"kubernetes.io/projected/e90e39af-b3f0-4aa7-bf60-0d2cdfc05082-kube-api-access-gdfmn\") pod \"redhat-operators-fcnj2\" (UID: \"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082\") " pod="openshift-marketplace/redhat-operators-fcnj2" Dec 05 06:49:16 crc kubenswrapper[4865]: I1205 06:49:16.235235 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fcnj2" Dec 05 06:49:16 crc kubenswrapper[4865]: I1205 06:49:16.841934 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fcnj2"] Dec 05 06:49:17 crc kubenswrapper[4865]: I1205 06:49:17.817897 4865 generic.go:334] "Generic (PLEG): container finished" podID="e90e39af-b3f0-4aa7-bf60-0d2cdfc05082" containerID="c3a465685e53f3c1a265becbff1b5bf09df5e6ee821621a90371b5e4889bd4fd" exitCode=0 Dec 05 06:49:17 crc kubenswrapper[4865]: I1205 06:49:17.818132 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcnj2" event={"ID":"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082","Type":"ContainerDied","Data":"c3a465685e53f3c1a265becbff1b5bf09df5e6ee821621a90371b5e4889bd4fd"} Dec 05 06:49:17 crc kubenswrapper[4865]: I1205 06:49:17.818364 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcnj2" event={"ID":"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082","Type":"ContainerStarted","Data":"88767cd4f78991d162f126e9ccef0237fbddd07c8a411907cd5c5c2cc5f6cdbb"} Dec 05 06:49:17 crc kubenswrapper[4865]: I1205 06:49:17.928189 4865 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 06:49:18 crc kubenswrapper[4865]: I1205 06:49:18.829558 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcnj2" event={"ID":"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082","Type":"ContainerStarted","Data":"40016df7f26f3540b69b7ad60658856b67a84100b44d100194e859ab3bc689a5"} Dec 05 06:49:21 crc kubenswrapper[4865]: I1205 06:49:21.884220 4865 generic.go:334] "Generic (PLEG): container finished" podID="e90e39af-b3f0-4aa7-bf60-0d2cdfc05082" containerID="40016df7f26f3540b69b7ad60658856b67a84100b44d100194e859ab3bc689a5" exitCode=0 Dec 05 06:49:21 crc kubenswrapper[4865]: I1205 06:49:21.884448 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcnj2" event={"ID":"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082","Type":"ContainerDied","Data":"40016df7f26f3540b69b7ad60658856b67a84100b44d100194e859ab3bc689a5"} Dec 05 06:49:22 crc kubenswrapper[4865]: I1205 06:49:22.894965 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcnj2" event={"ID":"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082","Type":"ContainerStarted","Data":"09cd1b6729a714e9ad0e9a942583fabeb3bfa89468dacc60504d92ce623253ef"} Dec 05 06:49:22 crc kubenswrapper[4865]: I1205 06:49:22.917005 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fcnj2" podStartSLOduration=3.485662497 podStartE2EDuration="7.916983467s" podCreationTimestamp="2025-12-05 06:49:15 +0000 UTC" firstStartedPulling="2025-12-05 06:49:17.84176982 +0000 UTC m=+3377.121781042" lastFinishedPulling="2025-12-05 06:49:22.27309078 +0000 UTC m=+3381.553102012" observedRunningTime="2025-12-05 06:49:22.91671758 +0000 UTC m=+3382.196728802" watchObservedRunningTime="2025-12-05 06:49:22.916983467 +0000 UTC m=+3382.196994689" Dec 05 06:49:26 crc 
kubenswrapper[4865]: I1205 06:49:26.236297 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fcnj2" Dec 05 06:49:26 crc kubenswrapper[4865]: I1205 06:49:26.236908 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fcnj2" Dec 05 06:49:27 crc kubenswrapper[4865]: I1205 06:49:27.293362 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fcnj2" podUID="e90e39af-b3f0-4aa7-bf60-0d2cdfc05082" containerName="registry-server" probeResult="failure" output=< Dec 05 06:49:27 crc kubenswrapper[4865]: timeout: failed to connect service ":50051" within 1s Dec 05 06:49:27 crc kubenswrapper[4865]: > Dec 05 06:49:36 crc kubenswrapper[4865]: I1205 06:49:36.306109 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fcnj2" Dec 05 06:49:36 crc kubenswrapper[4865]: I1205 06:49:36.361903 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fcnj2" Dec 05 06:49:36 crc kubenswrapper[4865]: I1205 06:49:36.549204 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fcnj2"] Dec 05 06:49:38 crc kubenswrapper[4865]: I1205 06:49:38.029086 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fcnj2" podUID="e90e39af-b3f0-4aa7-bf60-0d2cdfc05082" containerName="registry-server" containerID="cri-o://09cd1b6729a714e9ad0e9a942583fabeb3bfa89468dacc60504d92ce623253ef" gracePeriod=2 Dec 05 06:49:38 crc kubenswrapper[4865]: I1205 06:49:38.571134 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fcnj2" Dec 05 06:49:38 crc kubenswrapper[4865]: I1205 06:49:38.704230 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e90e39af-b3f0-4aa7-bf60-0d2cdfc05082-utilities\") pod \"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082\" (UID: \"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082\") " Dec 05 06:49:38 crc kubenswrapper[4865]: I1205 06:49:38.704684 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdfmn\" (UniqueName: \"kubernetes.io/projected/e90e39af-b3f0-4aa7-bf60-0d2cdfc05082-kube-api-access-gdfmn\") pod \"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082\" (UID: \"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082\") " Dec 05 06:49:38 crc kubenswrapper[4865]: I1205 06:49:38.704848 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e90e39af-b3f0-4aa7-bf60-0d2cdfc05082-catalog-content\") pod \"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082\" (UID: \"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082\") " Dec 05 06:49:38 crc kubenswrapper[4865]: I1205 06:49:38.705229 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e90e39af-b3f0-4aa7-bf60-0d2cdfc05082-utilities" (OuterVolumeSpecName: "utilities") pod "e90e39af-b3f0-4aa7-bf60-0d2cdfc05082" (UID: "e90e39af-b3f0-4aa7-bf60-0d2cdfc05082"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:49:38 crc kubenswrapper[4865]: I1205 06:49:38.705669 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e90e39af-b3f0-4aa7-bf60-0d2cdfc05082-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:49:38 crc kubenswrapper[4865]: I1205 06:49:38.710590 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e90e39af-b3f0-4aa7-bf60-0d2cdfc05082-kube-api-access-gdfmn" (OuterVolumeSpecName: "kube-api-access-gdfmn") pod "e90e39af-b3f0-4aa7-bf60-0d2cdfc05082" (UID: "e90e39af-b3f0-4aa7-bf60-0d2cdfc05082"). InnerVolumeSpecName "kube-api-access-gdfmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:49:38 crc kubenswrapper[4865]: I1205 06:49:38.808725 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdfmn\" (UniqueName: \"kubernetes.io/projected/e90e39af-b3f0-4aa7-bf60-0d2cdfc05082-kube-api-access-gdfmn\") on node \"crc\" DevicePath \"\"" Dec 05 06:49:38 crc kubenswrapper[4865]: I1205 06:49:38.828093 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e90e39af-b3f0-4aa7-bf60-0d2cdfc05082-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e90e39af-b3f0-4aa7-bf60-0d2cdfc05082" (UID: "e90e39af-b3f0-4aa7-bf60-0d2cdfc05082"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:49:38 crc kubenswrapper[4865]: I1205 06:49:38.910785 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e90e39af-b3f0-4aa7-bf60-0d2cdfc05082-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:49:39 crc kubenswrapper[4865]: I1205 06:49:39.041856 4865 generic.go:334] "Generic (PLEG): container finished" podID="e90e39af-b3f0-4aa7-bf60-0d2cdfc05082" containerID="09cd1b6729a714e9ad0e9a942583fabeb3bfa89468dacc60504d92ce623253ef" exitCode=0 Dec 05 06:49:39 crc kubenswrapper[4865]: I1205 06:49:39.041899 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcnj2" event={"ID":"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082","Type":"ContainerDied","Data":"09cd1b6729a714e9ad0e9a942583fabeb3bfa89468dacc60504d92ce623253ef"} Dec 05 06:49:39 crc kubenswrapper[4865]: I1205 06:49:39.041931 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcnj2" event={"ID":"e90e39af-b3f0-4aa7-bf60-0d2cdfc05082","Type":"ContainerDied","Data":"88767cd4f78991d162f126e9ccef0237fbddd07c8a411907cd5c5c2cc5f6cdbb"} Dec 05 06:49:39 crc kubenswrapper[4865]: I1205 06:49:39.041949 4865 scope.go:117] "RemoveContainer" containerID="09cd1b6729a714e9ad0e9a942583fabeb3bfa89468dacc60504d92ce623253ef" Dec 05 06:49:39 crc kubenswrapper[4865]: I1205 06:49:39.042089 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fcnj2" Dec 05 06:49:39 crc kubenswrapper[4865]: I1205 06:49:39.066334 4865 scope.go:117] "RemoveContainer" containerID="40016df7f26f3540b69b7ad60658856b67a84100b44d100194e859ab3bc689a5" Dec 05 06:49:39 crc kubenswrapper[4865]: I1205 06:49:39.069407 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fcnj2"] Dec 05 06:49:39 crc kubenswrapper[4865]: I1205 06:49:39.078978 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fcnj2"] Dec 05 06:49:39 crc kubenswrapper[4865]: I1205 06:49:39.090948 4865 scope.go:117] "RemoveContainer" containerID="c3a465685e53f3c1a265becbff1b5bf09df5e6ee821621a90371b5e4889bd4fd" Dec 05 06:49:39 crc kubenswrapper[4865]: I1205 06:49:39.153497 4865 scope.go:117] "RemoveContainer" containerID="09cd1b6729a714e9ad0e9a942583fabeb3bfa89468dacc60504d92ce623253ef" Dec 05 06:49:39 crc kubenswrapper[4865]: E1205 06:49:39.154428 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09cd1b6729a714e9ad0e9a942583fabeb3bfa89468dacc60504d92ce623253ef\": container with ID starting with 09cd1b6729a714e9ad0e9a942583fabeb3bfa89468dacc60504d92ce623253ef not found: ID does not exist" containerID="09cd1b6729a714e9ad0e9a942583fabeb3bfa89468dacc60504d92ce623253ef" Dec 05 06:49:39 crc kubenswrapper[4865]: I1205 06:49:39.154461 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09cd1b6729a714e9ad0e9a942583fabeb3bfa89468dacc60504d92ce623253ef"} err="failed to get container status \"09cd1b6729a714e9ad0e9a942583fabeb3bfa89468dacc60504d92ce623253ef\": rpc error: code = NotFound desc = could not find container \"09cd1b6729a714e9ad0e9a942583fabeb3bfa89468dacc60504d92ce623253ef\": container with ID starting with 09cd1b6729a714e9ad0e9a942583fabeb3bfa89468dacc60504d92ce623253ef not found: ID does not exist" Dec 05 06:49:39 crc kubenswrapper[4865]: I1205 06:49:39.154486 4865 scope.go:117] "RemoveContainer" containerID="40016df7f26f3540b69b7ad60658856b67a84100b44d100194e859ab3bc689a5" Dec 05 06:49:39 crc kubenswrapper[4865]: E1205 06:49:39.155057 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40016df7f26f3540b69b7ad60658856b67a84100b44d100194e859ab3bc689a5\": container with ID starting with 40016df7f26f3540b69b7ad60658856b67a84100b44d100194e859ab3bc689a5 not found: ID does not exist" containerID="40016df7f26f3540b69b7ad60658856b67a84100b44d100194e859ab3bc689a5" Dec 05 06:49:39 crc kubenswrapper[4865]: I1205 06:49:39.155088 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40016df7f26f3540b69b7ad60658856b67a84100b44d100194e859ab3bc689a5"} err="failed to get container status \"40016df7f26f3540b69b7ad60658856b67a84100b44d100194e859ab3bc689a5\": rpc error: code = NotFound desc = could not find container \"40016df7f26f3540b69b7ad60658856b67a84100b44d100194e859ab3bc689a5\": container with ID starting with 40016df7f26f3540b69b7ad60658856b67a84100b44d100194e859ab3bc689a5 not found: ID does not exist" Dec 05 06:49:39 crc kubenswrapper[4865]: I1205 06:49:39.155103 4865 scope.go:117] "RemoveContainer" containerID="c3a465685e53f3c1a265becbff1b5bf09df5e6ee821621a90371b5e4889bd4fd" Dec 05 06:49:39 crc kubenswrapper[4865]: E1205 06:49:39.156030 4865 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"c3a465685e53f3c1a265becbff1b5bf09df5e6ee821621a90371b5e4889bd4fd\": container with ID starting with c3a465685e53f3c1a265becbff1b5bf09df5e6ee821621a90371b5e4889bd4fd not found: ID does not exist" containerID="c3a465685e53f3c1a265becbff1b5bf09df5e6ee821621a90371b5e4889bd4fd" Dec 05 06:49:39 crc kubenswrapper[4865]: I1205 06:49:39.156086 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3a465685e53f3c1a265becbff1b5bf09df5e6ee821621a90371b5e4889bd4fd"} err="failed to get container status \"c3a465685e53f3c1a265becbff1b5bf09df5e6ee821621a90371b5e4889bd4fd\": rpc error: code = NotFound desc = could not find container \"c3a465685e53f3c1a265becbff1b5bf09df5e6ee821621a90371b5e4889bd4fd\": container with ID starting with c3a465685e53f3c1a265becbff1b5bf09df5e6ee821621a90371b5e4889bd4fd not found: ID does not exist" Dec 05 06:49:39 crc kubenswrapper[4865]: I1205 06:49:39.653544 4865 scope.go:117] "RemoveContainer" containerID="fcbcc744ef2051e3a2c852c79a598df9dbee8a26fe01a10f88fa0480725cd5ce" Dec 05 06:49:41 crc kubenswrapper[4865]: I1205 06:49:41.019210 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e90e39af-b3f0-4aa7-bf60-0d2cdfc05082" path="/var/lib/kubelet/pods/e90e39af-b3f0-4aa7-bf60-0d2cdfc05082/volumes" Dec 05 06:49:41 crc kubenswrapper[4865]: I1205 06:49:41.049350 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:49:41 crc kubenswrapper[4865]: I1205 06:49:41.049415 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:50:11 crc kubenswrapper[4865]: I1205 06:50:11.049415 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:50:11 crc kubenswrapper[4865]: I1205 06:50:11.050257 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:50:11 crc kubenswrapper[4865]: I1205 06:50:11.050305 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 06:50:11 crc kubenswrapper[4865]: I1205 06:50:11.051124 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ffb7a7d60380188b204175d8d8ea12d8da0e5ef2dd245cd192b61c5283d10792"} pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 06:50:11 crc kubenswrapper[4865]: I1205 
06:50:11.051177 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" containerID="cri-o://ffb7a7d60380188b204175d8d8ea12d8da0e5ef2dd245cd192b61c5283d10792" gracePeriod=600 Dec 05 06:50:11 crc kubenswrapper[4865]: I1205 06:50:11.409758 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="ffb7a7d60380188b204175d8d8ea12d8da0e5ef2dd245cd192b61c5283d10792" exitCode=0 Dec 05 06:50:11 crc kubenswrapper[4865]: I1205 06:50:11.409940 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"ffb7a7d60380188b204175d8d8ea12d8da0e5ef2dd245cd192b61c5283d10792"} Dec 05 06:50:11 crc kubenswrapper[4865]: I1205 06:50:11.410348 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203"} Dec 05 06:50:11 crc kubenswrapper[4865]: I1205 06:50:11.410382 4865 scope.go:117] "RemoveContainer" containerID="22a596096dc0035f8aacc4e6f08da2853c69bade8c217ca305f12cf624a8f7d8" Dec 05 06:50:39 crc kubenswrapper[4865]: I1205 06:50:39.706348 4865 scope.go:117] "RemoveContainer" containerID="59d153190f135a046b7b4b3ba4f3575d739afd8018a6f131b68cd95e2bae79db" Dec 05 06:50:39 crc kubenswrapper[4865]: I1205 06:50:39.735463 4865 scope.go:117] "RemoveContainer" containerID="df2eb54c435fe77c0faaf67e2978b8f72f2e4e44de4bc7b70893ae1768093da6" Dec 05 06:52:11 crc kubenswrapper[4865]: I1205 06:52:11.052490 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:52:11 crc kubenswrapper[4865]: I1205 06:52:11.053255 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:52:41 crc kubenswrapper[4865]: I1205 06:52:41.049487 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:52:41 crc kubenswrapper[4865]: I1205 06:52:41.050038 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:53:11 crc kubenswrapper[4865]: I1205 06:53:11.050097 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 06:53:11 crc kubenswrapper[4865]: I1205 06:53:11.050880 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 06:53:11 crc kubenswrapper[4865]: I1205 06:53:11.050968 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 06:53:11 crc kubenswrapper[4865]: I1205 06:53:11.051935 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203"} pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 06:53:11 crc kubenswrapper[4865]: I1205 06:53:11.052004 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" containerID="cri-o://2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" gracePeriod=600 Dec 05 06:53:11 crc kubenswrapper[4865]: E1205 06:53:11.186747 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:53:11 crc kubenswrapper[4865]: I1205 06:53:11.250113 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" exitCode=0 Dec 05 06:53:11 crc kubenswrapper[4865]: I1205 06:53:11.250159 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203"} Dec 05 06:53:11 crc kubenswrapper[4865]: I1205 06:53:11.250192 4865 scope.go:117] "RemoveContainer" containerID="ffb7a7d60380188b204175d8d8ea12d8da0e5ef2dd245cd192b61c5283d10792" Dec 05 06:53:11 crc kubenswrapper[4865]: I1205 06:53:11.250886 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:53:11 crc kubenswrapper[4865]: E1205 06:53:11.251123 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:53:23 crc kubenswrapper[4865]: I1205 06:53:23.006299 4865 
scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:53:23 crc kubenswrapper[4865]: E1205 06:53:23.006981 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:53:38 crc kubenswrapper[4865]: I1205 06:53:38.007011 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:53:38 crc kubenswrapper[4865]: E1205 06:53:38.007986 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:53:53 crc kubenswrapper[4865]: I1205 06:53:53.006498 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:53:53 crc kubenswrapper[4865]: E1205 06:53:53.007225 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:54:08 crc kubenswrapper[4865]: I1205 06:54:08.006571 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:54:08 crc kubenswrapper[4865]: E1205 06:54:08.007354 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:54:22 crc kubenswrapper[4865]: I1205 06:54:22.006558 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:54:22 crc kubenswrapper[4865]: E1205 06:54:22.008331 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:54:37 crc kubenswrapper[4865]: I1205 06:54:37.010635 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:54:37 crc kubenswrapper[4865]: E1205 06:54:37.011488 4865 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:54:48 crc kubenswrapper[4865]: I1205 06:54:48.007467 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:54:48 crc kubenswrapper[4865]: E1205 06:54:48.008503 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:55:00 crc kubenswrapper[4865]: I1205 06:55:00.006820 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:55:00 crc kubenswrapper[4865]: E1205 06:55:00.007599 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:55:15 crc kubenswrapper[4865]: I1205 06:55:15.007658 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:55:15 crc kubenswrapper[4865]: E1205 06:55:15.010524 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:55:29 crc kubenswrapper[4865]: I1205 06:55:29.006627 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:55:29 crc kubenswrapper[4865]: E1205 06:55:29.007334 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:55:44 crc kubenswrapper[4865]: I1205 06:55:44.006670 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:55:44 crc kubenswrapper[4865]: E1205 06:55:44.007452 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:55:55 crc kubenswrapper[4865]: I1205 06:55:55.006707 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:55:55 crc kubenswrapper[4865]: E1205 06:55:55.007558 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:56:09 crc kubenswrapper[4865]: I1205 06:56:09.007294 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:56:09 crc kubenswrapper[4865]: E1205 06:56:09.008306 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:56:21 crc kubenswrapper[4865]: I1205 06:56:21.007322 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:56:21 crc kubenswrapper[4865]: E1205 06:56:21.007957 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:56:36 crc kubenswrapper[4865]: I1205 06:56:36.006879 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:56:36 crc kubenswrapper[4865]: E1205 06:56:36.007965 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:56:51 crc kubenswrapper[4865]: I1205 06:56:51.014035 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:56:51 crc kubenswrapper[4865]: E1205 06:56:51.015157 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:57:03 crc kubenswrapper[4865]: I1205 06:57:03.006628 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:57:03 crc kubenswrapper[4865]: E1205 06:57:03.007310 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:57:17 crc kubenswrapper[4865]: I1205 06:57:17.006989 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:57:17 crc kubenswrapper[4865]: E1205 06:57:17.008205 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:57:28 crc kubenswrapper[4865]: I1205 06:57:28.006366 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:57:28 crc kubenswrapper[4865]: E1205 06:57:28.007528 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:57:42 crc kubenswrapper[4865]: I1205 06:57:42.006888 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:57:42 crc kubenswrapper[4865]: E1205 06:57:42.008862 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:57:54 crc kubenswrapper[4865]: I1205 06:57:54.006848 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:57:54 crc kubenswrapper[4865]: E1205 06:57:54.007722 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:58:08 crc kubenswrapper[4865]: I1205 06:58:08.006557 4865 
scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:58:08 crc kubenswrapper[4865]: E1205 06:58:08.007240 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 06:58:15 crc kubenswrapper[4865]: I1205 06:58:15.043051 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bg6ns"] Dec 05 06:58:15 crc kubenswrapper[4865]: E1205 06:58:15.044168 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e90e39af-b3f0-4aa7-bf60-0d2cdfc05082" containerName="registry-server" Dec 05 06:58:15 crc kubenswrapper[4865]: I1205 06:58:15.044188 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="e90e39af-b3f0-4aa7-bf60-0d2cdfc05082" containerName="registry-server" Dec 05 06:58:15 crc kubenswrapper[4865]: E1205 06:58:15.044217 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e90e39af-b3f0-4aa7-bf60-0d2cdfc05082" containerName="extract-utilities" Dec 05 06:58:15 crc kubenswrapper[4865]: I1205 06:58:15.044226 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="e90e39af-b3f0-4aa7-bf60-0d2cdfc05082" containerName="extract-utilities" Dec 05 06:58:15 crc kubenswrapper[4865]: E1205 06:58:15.044265 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e90e39af-b3f0-4aa7-bf60-0d2cdfc05082" containerName="extract-content" Dec 05 06:58:15 crc kubenswrapper[4865]: I1205 06:58:15.044274 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="e90e39af-b3f0-4aa7-bf60-0d2cdfc05082" containerName="extract-content" Dec 05 06:58:15 crc kubenswrapper[4865]: I1205 06:58:15.044517 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="e90e39af-b3f0-4aa7-bf60-0d2cdfc05082" containerName="registry-server" Dec 05 06:58:15 crc kubenswrapper[4865]: I1205 06:58:15.049777 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bg6ns" Dec 05 06:58:15 crc kubenswrapper[4865]: I1205 06:58:15.067249 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bg6ns"] Dec 05 06:58:15 crc kubenswrapper[4865]: I1205 06:58:15.179616 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn7b2\" (UniqueName: \"kubernetes.io/projected/f26a3d5e-08ea-4582-859b-da68fd37d9e9-kube-api-access-hn7b2\") pod \"certified-operators-bg6ns\" (UID: \"f26a3d5e-08ea-4582-859b-da68fd37d9e9\") " pod="openshift-marketplace/certified-operators-bg6ns" Dec 05 06:58:15 crc kubenswrapper[4865]: I1205 06:58:15.179664 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f26a3d5e-08ea-4582-859b-da68fd37d9e9-catalog-content\") pod \"certified-operators-bg6ns\" (UID: \"f26a3d5e-08ea-4582-859b-da68fd37d9e9\") " pod="openshift-marketplace/certified-operators-bg6ns" Dec 05 06:58:15 crc kubenswrapper[4865]: I1205 06:58:15.179743 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f26a3d5e-08ea-4582-859b-da68fd37d9e9-utilities\") pod \"certified-operators-bg6ns\" (UID: \"f26a3d5e-08ea-4582-859b-da68fd37d9e9\") " pod="openshift-marketplace/certified-operators-bg6ns" Dec 05 06:58:15 crc kubenswrapper[4865]: I1205 06:58:15.281136 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f26a3d5e-08ea-4582-859b-da68fd37d9e9-utilities\") pod \"certified-operators-bg6ns\" (UID: \"f26a3d5e-08ea-4582-859b-da68fd37d9e9\") " pod="openshift-marketplace/certified-operators-bg6ns" Dec 05 06:58:15 crc kubenswrapper[4865]: I1205 06:58:15.281627 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn7b2\" (UniqueName: \"kubernetes.io/projected/f26a3d5e-08ea-4582-859b-da68fd37d9e9-kube-api-access-hn7b2\") pod \"certified-operators-bg6ns\" (UID: \"f26a3d5e-08ea-4582-859b-da68fd37d9e9\") " pod="openshift-marketplace/certified-operators-bg6ns" Dec 05 06:58:15 crc kubenswrapper[4865]: I1205 06:58:15.281773 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f26a3d5e-08ea-4582-859b-da68fd37d9e9-catalog-content\") pod \"certified-operators-bg6ns\" (UID: \"f26a3d5e-08ea-4582-859b-da68fd37d9e9\") " pod="openshift-marketplace/certified-operators-bg6ns" Dec 05 06:58:15 crc kubenswrapper[4865]: I1205 06:58:15.282313 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f26a3d5e-08ea-4582-859b-da68fd37d9e9-catalog-content\") pod \"certified-operators-bg6ns\" (UID: \"f26a3d5e-08ea-4582-859b-da68fd37d9e9\") " pod="openshift-marketplace/certified-operators-bg6ns" Dec 05 06:58:15 crc kubenswrapper[4865]: I1205 06:58:15.281668 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f26a3d5e-08ea-4582-859b-da68fd37d9e9-utilities\") pod \"certified-operators-bg6ns\" (UID: \"f26a3d5e-08ea-4582-859b-da68fd37d9e9\") " pod="openshift-marketplace/certified-operators-bg6ns" Dec 05 06:58:15 crc kubenswrapper[4865]: I1205 06:58:15.327775 4865 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hn7b2\" (UniqueName: \"kubernetes.io/projected/f26a3d5e-08ea-4582-859b-da68fd37d9e9-kube-api-access-hn7b2\") pod \"certified-operators-bg6ns\" (UID: \"f26a3d5e-08ea-4582-859b-da68fd37d9e9\") " pod="openshift-marketplace/certified-operators-bg6ns" Dec 05 06:58:15 crc kubenswrapper[4865]: I1205 06:58:15.369619 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bg6ns" Dec 05 06:58:16 crc kubenswrapper[4865]: I1205 06:58:16.085270 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bg6ns"] Dec 05 06:58:16 crc kubenswrapper[4865]: I1205 06:58:16.136266 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg6ns" event={"ID":"f26a3d5e-08ea-4582-859b-da68fd37d9e9","Type":"ContainerStarted","Data":"956e0a15f87e9182c1eece17fb83ea906beff45c15b1c594428e963876bff0e3"} Dec 05 06:58:17 crc kubenswrapper[4865]: I1205 06:58:17.306886 4865 generic.go:334] "Generic (PLEG): container finished" podID="f26a3d5e-08ea-4582-859b-da68fd37d9e9" containerID="3b68ec93cbbb707a4933584e752a825f135749cf0887e3913cba9f63f2c487c0" exitCode=0 Dec 05 06:58:17 crc kubenswrapper[4865]: I1205 06:58:17.307335 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg6ns" event={"ID":"f26a3d5e-08ea-4582-859b-da68fd37d9e9","Type":"ContainerDied","Data":"3b68ec93cbbb707a4933584e752a825f135749cf0887e3913cba9f63f2c487c0"} Dec 05 06:58:17 crc kubenswrapper[4865]: I1205 06:58:17.310503 4865 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 06:58:18 crc kubenswrapper[4865]: I1205 06:58:18.318905 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg6ns" event={"ID":"f26a3d5e-08ea-4582-859b-da68fd37d9e9","Type":"ContainerStarted","Data":"59d401bc4c03d7a1a5e373d6a633b44c9dffbbd0c4c67db1ade85f13c6ae7573"} Dec 05 06:58:19 crc kubenswrapper[4865]: I1205 06:58:19.347548 4865 generic.go:334] "Generic (PLEG): container finished" podID="f26a3d5e-08ea-4582-859b-da68fd37d9e9" containerID="59d401bc4c03d7a1a5e373d6a633b44c9dffbbd0c4c67db1ade85f13c6ae7573" exitCode=0 Dec 05 06:58:19 crc kubenswrapper[4865]: I1205 06:58:19.347636 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg6ns" event={"ID":"f26a3d5e-08ea-4582-859b-da68fd37d9e9","Type":"ContainerDied","Data":"59d401bc4c03d7a1a5e373d6a633b44c9dffbbd0c4c67db1ade85f13c6ae7573"} Dec 05 06:58:20 crc kubenswrapper[4865]: I1205 06:58:20.359390 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg6ns" event={"ID":"f26a3d5e-08ea-4582-859b-da68fd37d9e9","Type":"ContainerStarted","Data":"4ce487aa66c593e4dd5af40c94f097a98703036e5e7575d7635f7f6fee6ae8a7"} Dec 05 06:58:20 crc kubenswrapper[4865]: I1205 06:58:20.377390 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bg6ns" podStartSLOduration=2.944908257 podStartE2EDuration="5.377367429s" podCreationTimestamp="2025-12-05 06:58:15 +0000 UTC" firstStartedPulling="2025-12-05 06:58:17.310234741 +0000 UTC m=+3916.590245963" lastFinishedPulling="2025-12-05 06:58:19.742693913 +0000 UTC m=+3919.022705135" observedRunningTime="2025-12-05 06:58:20.376552746 +0000 UTC m=+3919.656563988" watchObservedRunningTime="2025-12-05 
06:58:20.377367429 +0000 UTC m=+3919.657378651" Dec 05 06:58:23 crc kubenswrapper[4865]: I1205 06:58:23.007032 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 06:58:23 crc kubenswrapper[4865]: I1205 06:58:23.386924 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"af6ec7858844e06383504664e3428c363dcae3dff9a3af262236079433a1d744"} Dec 05 06:58:25 crc kubenswrapper[4865]: I1205 06:58:25.370253 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bg6ns" Dec 05 06:58:25 crc kubenswrapper[4865]: I1205 06:58:25.373098 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bg6ns" Dec 05 06:58:25 crc kubenswrapper[4865]: I1205 06:58:25.446988 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bg6ns" Dec 05 06:58:25 crc kubenswrapper[4865]: I1205 06:58:25.537274 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bg6ns" Dec 05 06:58:25 crc kubenswrapper[4865]: I1205 06:58:25.693777 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bg6ns"] Dec 05 06:58:27 crc kubenswrapper[4865]: I1205 06:58:27.424323 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bg6ns" podUID="f26a3d5e-08ea-4582-859b-da68fd37d9e9" containerName="registry-server" containerID="cri-o://4ce487aa66c593e4dd5af40c94f097a98703036e5e7575d7635f7f6fee6ae8a7" gracePeriod=2 Dec 05 06:58:27 crc kubenswrapper[4865]: I1205 06:58:27.983742 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bg6ns" Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.148408 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f26a3d5e-08ea-4582-859b-da68fd37d9e9-utilities\") pod \"f26a3d5e-08ea-4582-859b-da68fd37d9e9\" (UID: \"f26a3d5e-08ea-4582-859b-da68fd37d9e9\") " Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.148870 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f26a3d5e-08ea-4582-859b-da68fd37d9e9-catalog-content\") pod \"f26a3d5e-08ea-4582-859b-da68fd37d9e9\" (UID: \"f26a3d5e-08ea-4582-859b-da68fd37d9e9\") " Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.149036 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn7b2\" (UniqueName: \"kubernetes.io/projected/f26a3d5e-08ea-4582-859b-da68fd37d9e9-kube-api-access-hn7b2\") pod \"f26a3d5e-08ea-4582-859b-da68fd37d9e9\" (UID: \"f26a3d5e-08ea-4582-859b-da68fd37d9e9\") " Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.150284 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f26a3d5e-08ea-4582-859b-da68fd37d9e9-utilities" (OuterVolumeSpecName: "utilities") pod "f26a3d5e-08ea-4582-859b-da68fd37d9e9" (UID: "f26a3d5e-08ea-4582-859b-da68fd37d9e9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.161001 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f26a3d5e-08ea-4582-859b-da68fd37d9e9-kube-api-access-hn7b2" (OuterVolumeSpecName: "kube-api-access-hn7b2") pod "f26a3d5e-08ea-4582-859b-da68fd37d9e9" (UID: "f26a3d5e-08ea-4582-859b-da68fd37d9e9"). InnerVolumeSpecName "kube-api-access-hn7b2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.200544 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f26a3d5e-08ea-4582-859b-da68fd37d9e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f26a3d5e-08ea-4582-859b-da68fd37d9e9" (UID: "f26a3d5e-08ea-4582-859b-da68fd37d9e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.251453 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f26a3d5e-08ea-4582-859b-da68fd37d9e9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.251479 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn7b2\" (UniqueName: \"kubernetes.io/projected/f26a3d5e-08ea-4582-859b-da68fd37d9e9-kube-api-access-hn7b2\") on node \"crc\" DevicePath \"\"" Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.251489 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f26a3d5e-08ea-4582-859b-da68fd37d9e9-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.434132 4865 generic.go:334] "Generic (PLEG): container finished" podID="f26a3d5e-08ea-4582-859b-da68fd37d9e9" containerID="4ce487aa66c593e4dd5af40c94f097a98703036e5e7575d7635f7f6fee6ae8a7" exitCode=0 Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.434177 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg6ns" event={"ID":"f26a3d5e-08ea-4582-859b-da68fd37d9e9","Type":"ContainerDied","Data":"4ce487aa66c593e4dd5af40c94f097a98703036e5e7575d7635f7f6fee6ae8a7"} Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.434201 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bg6ns" Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.434211 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg6ns" event={"ID":"f26a3d5e-08ea-4582-859b-da68fd37d9e9","Type":"ContainerDied","Data":"956e0a15f87e9182c1eece17fb83ea906beff45c15b1c594428e963876bff0e3"} Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.434236 4865 scope.go:117] "RemoveContainer" containerID="4ce487aa66c593e4dd5af40c94f097a98703036e5e7575d7635f7f6fee6ae8a7" Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.472274 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bg6ns"] Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.480514 4865 scope.go:117] "RemoveContainer" containerID="59d401bc4c03d7a1a5e373d6a633b44c9dffbbd0c4c67db1ade85f13c6ae7573" Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.484569 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bg6ns"] Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.511532 4865 scope.go:117] "RemoveContainer" containerID="3b68ec93cbbb707a4933584e752a825f135749cf0887e3913cba9f63f2c487c0" Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.550278 4865 scope.go:117] "RemoveContainer" containerID="4ce487aa66c593e4dd5af40c94f097a98703036e5e7575d7635f7f6fee6ae8a7" Dec 05 06:58:28 crc kubenswrapper[4865]: E1205 06:58:28.550754 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ce487aa66c593e4dd5af40c94f097a98703036e5e7575d7635f7f6fee6ae8a7\": container with ID starting with 4ce487aa66c593e4dd5af40c94f097a98703036e5e7575d7635f7f6fee6ae8a7 not found: ID does not exist" containerID="4ce487aa66c593e4dd5af40c94f097a98703036e5e7575d7635f7f6fee6ae8a7" Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.550797 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce487aa66c593e4dd5af40c94f097a98703036e5e7575d7635f7f6fee6ae8a7"} err="failed to get container status \"4ce487aa66c593e4dd5af40c94f097a98703036e5e7575d7635f7f6fee6ae8a7\": rpc error: code = NotFound desc = could not find container \"4ce487aa66c593e4dd5af40c94f097a98703036e5e7575d7635f7f6fee6ae8a7\": container with ID starting with 4ce487aa66c593e4dd5af40c94f097a98703036e5e7575d7635f7f6fee6ae8a7 not found: ID does not exist" Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.550868 4865 scope.go:117] "RemoveContainer" containerID="59d401bc4c03d7a1a5e373d6a633b44c9dffbbd0c4c67db1ade85f13c6ae7573" Dec 05 06:58:28 crc kubenswrapper[4865]: E1205 06:58:28.551196 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59d401bc4c03d7a1a5e373d6a633b44c9dffbbd0c4c67db1ade85f13c6ae7573\": container with ID starting with 59d401bc4c03d7a1a5e373d6a633b44c9dffbbd0c4c67db1ade85f13c6ae7573 not found: ID does not exist" containerID="59d401bc4c03d7a1a5e373d6a633b44c9dffbbd0c4c67db1ade85f13c6ae7573" Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.551240 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59d401bc4c03d7a1a5e373d6a633b44c9dffbbd0c4c67db1ade85f13c6ae7573"} err="failed to get container status \"59d401bc4c03d7a1a5e373d6a633b44c9dffbbd0c4c67db1ade85f13c6ae7573\": rpc error: code = NotFound desc = could not find 
container \"59d401bc4c03d7a1a5e373d6a633b44c9dffbbd0c4c67db1ade85f13c6ae7573\": container with ID starting with 59d401bc4c03d7a1a5e373d6a633b44c9dffbbd0c4c67db1ade85f13c6ae7573 not found: ID does not exist" Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.551270 4865 scope.go:117] "RemoveContainer" containerID="3b68ec93cbbb707a4933584e752a825f135749cf0887e3913cba9f63f2c487c0" Dec 05 06:58:28 crc kubenswrapper[4865]: E1205 06:58:28.551566 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b68ec93cbbb707a4933584e752a825f135749cf0887e3913cba9f63f2c487c0\": container with ID starting with 3b68ec93cbbb707a4933584e752a825f135749cf0887e3913cba9f63f2c487c0 not found: ID does not exist" containerID="3b68ec93cbbb707a4933584e752a825f135749cf0887e3913cba9f63f2c487c0" Dec 05 06:58:28 crc kubenswrapper[4865]: I1205 06:58:28.551603 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b68ec93cbbb707a4933584e752a825f135749cf0887e3913cba9f63f2c487c0"} err="failed to get container status \"3b68ec93cbbb707a4933584e752a825f135749cf0887e3913cba9f63f2c487c0\": rpc error: code = NotFound desc = could not find container \"3b68ec93cbbb707a4933584e752a825f135749cf0887e3913cba9f63f2c487c0\": container with ID starting with 3b68ec93cbbb707a4933584e752a825f135749cf0887e3913cba9f63f2c487c0 not found: ID does not exist" Dec 05 06:58:29 crc kubenswrapper[4865]: I1205 06:58:29.019010 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f26a3d5e-08ea-4582-859b-da68fd37d9e9" path="/var/lib/kubelet/pods/f26a3d5e-08ea-4582-859b-da68fd37d9e9/volumes" Dec 05 06:58:51 crc kubenswrapper[4865]: I1205 06:58:51.480798 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7tx99"] Dec 05 06:58:51 crc kubenswrapper[4865]: E1205 06:58:51.482734 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26a3d5e-08ea-4582-859b-da68fd37d9e9" containerName="extract-utilities" Dec 05 06:58:51 crc kubenswrapper[4865]: I1205 06:58:51.482782 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26a3d5e-08ea-4582-859b-da68fd37d9e9" containerName="extract-utilities" Dec 05 06:58:51 crc kubenswrapper[4865]: E1205 06:58:51.482887 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26a3d5e-08ea-4582-859b-da68fd37d9e9" containerName="registry-server" Dec 05 06:58:51 crc kubenswrapper[4865]: I1205 06:58:51.482904 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26a3d5e-08ea-4582-859b-da68fd37d9e9" containerName="registry-server" Dec 05 06:58:51 crc kubenswrapper[4865]: E1205 06:58:51.482969 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26a3d5e-08ea-4582-859b-da68fd37d9e9" containerName="extract-content" Dec 05 06:58:51 crc kubenswrapper[4865]: I1205 06:58:51.482983 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26a3d5e-08ea-4582-859b-da68fd37d9e9" containerName="extract-content" Dec 05 06:58:51 crc kubenswrapper[4865]: I1205 06:58:51.483632 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="f26a3d5e-08ea-4582-859b-da68fd37d9e9" containerName="registry-server" Dec 05 06:58:51 crc kubenswrapper[4865]: I1205 06:58:51.490992 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7tx99" Dec 05 06:58:51 crc kubenswrapper[4865]: I1205 06:58:51.501060 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7tx99"] Dec 05 06:58:51 crc kubenswrapper[4865]: I1205 06:58:51.631031 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t29wg\" (UniqueName: \"kubernetes.io/projected/3b5926b0-6318-4464-a4f5-80facbda8f37-kube-api-access-t29wg\") pod \"redhat-marketplace-7tx99\" (UID: \"3b5926b0-6318-4464-a4f5-80facbda8f37\") " pod="openshift-marketplace/redhat-marketplace-7tx99" Dec 05 06:58:51 crc kubenswrapper[4865]: I1205 06:58:51.631166 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5926b0-6318-4464-a4f5-80facbda8f37-catalog-content\") pod \"redhat-marketplace-7tx99\" (UID: \"3b5926b0-6318-4464-a4f5-80facbda8f37\") " pod="openshift-marketplace/redhat-marketplace-7tx99" Dec 05 06:58:51 crc kubenswrapper[4865]: I1205 06:58:51.631527 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5926b0-6318-4464-a4f5-80facbda8f37-utilities\") pod \"redhat-marketplace-7tx99\" (UID: \"3b5926b0-6318-4464-a4f5-80facbda8f37\") " pod="openshift-marketplace/redhat-marketplace-7tx99" Dec 05 06:58:51 crc kubenswrapper[4865]: I1205 06:58:51.732871 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t29wg\" (UniqueName: \"kubernetes.io/projected/3b5926b0-6318-4464-a4f5-80facbda8f37-kube-api-access-t29wg\") pod \"redhat-marketplace-7tx99\" (UID: \"3b5926b0-6318-4464-a4f5-80facbda8f37\") " pod="openshift-marketplace/redhat-marketplace-7tx99" Dec 05 06:58:51 crc kubenswrapper[4865]: I1205 06:58:51.733236 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5926b0-6318-4464-a4f5-80facbda8f37-catalog-content\") pod \"redhat-marketplace-7tx99\" (UID: \"3b5926b0-6318-4464-a4f5-80facbda8f37\") " pod="openshift-marketplace/redhat-marketplace-7tx99" Dec 05 06:58:51 crc kubenswrapper[4865]: I1205 06:58:51.733359 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5926b0-6318-4464-a4f5-80facbda8f37-utilities\") pod \"redhat-marketplace-7tx99\" (UID: \"3b5926b0-6318-4464-a4f5-80facbda8f37\") " pod="openshift-marketplace/redhat-marketplace-7tx99" Dec 05 06:58:51 crc kubenswrapper[4865]: I1205 06:58:51.733932 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5926b0-6318-4464-a4f5-80facbda8f37-catalog-content\") pod \"redhat-marketplace-7tx99\" (UID: \"3b5926b0-6318-4464-a4f5-80facbda8f37\") " pod="openshift-marketplace/redhat-marketplace-7tx99" Dec 05 06:58:51 crc kubenswrapper[4865]: I1205 06:58:51.733980 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5926b0-6318-4464-a4f5-80facbda8f37-utilities\") pod \"redhat-marketplace-7tx99\" (UID: \"3b5926b0-6318-4464-a4f5-80facbda8f37\") " pod="openshift-marketplace/redhat-marketplace-7tx99" Dec 05 06:58:51 crc kubenswrapper[4865]: I1205 06:58:51.762781 4865 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-t29wg\" (UniqueName: \"kubernetes.io/projected/3b5926b0-6318-4464-a4f5-80facbda8f37-kube-api-access-t29wg\") pod \"redhat-marketplace-7tx99\" (UID: \"3b5926b0-6318-4464-a4f5-80facbda8f37\") " pod="openshift-marketplace/redhat-marketplace-7tx99" Dec 05 06:58:51 crc kubenswrapper[4865]: I1205 06:58:51.825963 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7tx99" Dec 05 06:58:52 crc kubenswrapper[4865]: I1205 06:58:52.291057 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7tx99"] Dec 05 06:58:52 crc kubenswrapper[4865]: I1205 06:58:52.682191 4865 generic.go:334] "Generic (PLEG): container finished" podID="3b5926b0-6318-4464-a4f5-80facbda8f37" containerID="cfd4922b9075180a8c3cc458b38ee5177f9e8c179955b7e0dcdb06112449c3ee" exitCode=0 Dec 05 06:58:52 crc kubenswrapper[4865]: I1205 06:58:52.682238 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7tx99" event={"ID":"3b5926b0-6318-4464-a4f5-80facbda8f37","Type":"ContainerDied","Data":"cfd4922b9075180a8c3cc458b38ee5177f9e8c179955b7e0dcdb06112449c3ee"} Dec 05 06:58:52 crc kubenswrapper[4865]: I1205 06:58:52.682267 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7tx99" event={"ID":"3b5926b0-6318-4464-a4f5-80facbda8f37","Type":"ContainerStarted","Data":"23a5e327e8db8347f20d401fe43537b9d94eb23d13b6c473b52ddb6a83167ba9"} Dec 05 06:58:53 crc kubenswrapper[4865]: I1205 06:58:53.696597 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7tx99" event={"ID":"3b5926b0-6318-4464-a4f5-80facbda8f37","Type":"ContainerStarted","Data":"6c9b305647a493ba4501db5dcea9a70e87fc98f3be2982bbfd16b7e5f211dbbb"} Dec 05 06:58:54 crc kubenswrapper[4865]: I1205 06:58:54.666658 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b9ds9"] Dec 05 06:58:54 crc kubenswrapper[4865]: I1205 06:58:54.670242 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b9ds9" Dec 05 06:58:54 crc kubenswrapper[4865]: I1205 06:58:54.686784 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b9ds9"] Dec 05 06:58:54 crc kubenswrapper[4865]: I1205 06:58:54.698072 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8a201b8-4813-4526-926a-51d3f4e99c51-catalog-content\") pod \"community-operators-b9ds9\" (UID: \"c8a201b8-4813-4526-926a-51d3f4e99c51\") " pod="openshift-marketplace/community-operators-b9ds9" Dec 05 06:58:54 crc kubenswrapper[4865]: I1205 06:58:54.698457 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps4cz\" (UniqueName: \"kubernetes.io/projected/c8a201b8-4813-4526-926a-51d3f4e99c51-kube-api-access-ps4cz\") pod \"community-operators-b9ds9\" (UID: \"c8a201b8-4813-4526-926a-51d3f4e99c51\") " pod="openshift-marketplace/community-operators-b9ds9" Dec 05 06:58:54 crc kubenswrapper[4865]: I1205 06:58:54.698577 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8a201b8-4813-4526-926a-51d3f4e99c51-utilities\") pod \"community-operators-b9ds9\" (UID: \"c8a201b8-4813-4526-926a-51d3f4e99c51\") " pod="openshift-marketplace/community-operators-b9ds9" Dec 05 06:58:54 crc kubenswrapper[4865]: I1205 06:58:54.734664 4865 generic.go:334] "Generic (PLEG): container finished" podID="3b5926b0-6318-4464-a4f5-80facbda8f37" containerID="6c9b305647a493ba4501db5dcea9a70e87fc98f3be2982bbfd16b7e5f211dbbb" exitCode=0 Dec 05 06:58:54 crc kubenswrapper[4865]: I1205 06:58:54.734746 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7tx99" event={"ID":"3b5926b0-6318-4464-a4f5-80facbda8f37","Type":"ContainerDied","Data":"6c9b305647a493ba4501db5dcea9a70e87fc98f3be2982bbfd16b7e5f211dbbb"} Dec 05 06:58:54 crc kubenswrapper[4865]: I1205 06:58:54.800318 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps4cz\" (UniqueName: \"kubernetes.io/projected/c8a201b8-4813-4526-926a-51d3f4e99c51-kube-api-access-ps4cz\") pod \"community-operators-b9ds9\" (UID: \"c8a201b8-4813-4526-926a-51d3f4e99c51\") " pod="openshift-marketplace/community-operators-b9ds9" Dec 05 06:58:54 crc kubenswrapper[4865]: I1205 06:58:54.800407 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8a201b8-4813-4526-926a-51d3f4e99c51-utilities\") pod \"community-operators-b9ds9\" (UID: \"c8a201b8-4813-4526-926a-51d3f4e99c51\") " pod="openshift-marketplace/community-operators-b9ds9" Dec 05 06:58:54 crc kubenswrapper[4865]: I1205 06:58:54.800474 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8a201b8-4813-4526-926a-51d3f4e99c51-catalog-content\") pod \"community-operators-b9ds9\" (UID: \"c8a201b8-4813-4526-926a-51d3f4e99c51\") " pod="openshift-marketplace/community-operators-b9ds9" Dec 05 06:58:54 crc kubenswrapper[4865]: I1205 06:58:54.801138 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8a201b8-4813-4526-926a-51d3f4e99c51-catalog-content\") pod \"community-operators-b9ds9\" 
(UID: \"c8a201b8-4813-4526-926a-51d3f4e99c51\") " pod="openshift-marketplace/community-operators-b9ds9" Dec 05 06:58:54 crc kubenswrapper[4865]: I1205 06:58:54.801534 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8a201b8-4813-4526-926a-51d3f4e99c51-utilities\") pod \"community-operators-b9ds9\" (UID: \"c8a201b8-4813-4526-926a-51d3f4e99c51\") " pod="openshift-marketplace/community-operators-b9ds9" Dec 05 06:58:54 crc kubenswrapper[4865]: I1205 06:58:54.833231 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps4cz\" (UniqueName: \"kubernetes.io/projected/c8a201b8-4813-4526-926a-51d3f4e99c51-kube-api-access-ps4cz\") pod \"community-operators-b9ds9\" (UID: \"c8a201b8-4813-4526-926a-51d3f4e99c51\") " pod="openshift-marketplace/community-operators-b9ds9" Dec 05 06:58:55 crc kubenswrapper[4865]: I1205 06:58:55.041430 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b9ds9" Dec 05 06:58:55 crc kubenswrapper[4865]: I1205 06:58:55.618160 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b9ds9"] Dec 05 06:58:55 crc kubenswrapper[4865]: W1205 06:58:55.619403 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8a201b8_4813_4526_926a_51d3f4e99c51.slice/crio-be3553d39bd3d3c7425a9fc022d5812fcd2d7dc7ee69f9f0874ac3d07f44112e WatchSource:0}: Error finding container be3553d39bd3d3c7425a9fc022d5812fcd2d7dc7ee69f9f0874ac3d07f44112e: Status 404 returned error can't find the container with id be3553d39bd3d3c7425a9fc022d5812fcd2d7dc7ee69f9f0874ac3d07f44112e Dec 05 06:58:55 crc kubenswrapper[4865]: I1205 06:58:55.744448 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7tx99" event={"ID":"3b5926b0-6318-4464-a4f5-80facbda8f37","Type":"ContainerStarted","Data":"1207f3e7648f6fb2ca9a6f4b4da3c15a20c015ed4a7fd5a1b0dde1c8a9a71e82"} Dec 05 06:58:55 crc kubenswrapper[4865]: I1205 06:58:55.747948 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9ds9" event={"ID":"c8a201b8-4813-4526-926a-51d3f4e99c51","Type":"ContainerStarted","Data":"be3553d39bd3d3c7425a9fc022d5812fcd2d7dc7ee69f9f0874ac3d07f44112e"} Dec 05 06:58:55 crc kubenswrapper[4865]: I1205 06:58:55.770474 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7tx99" podStartSLOduration=2.039288348 podStartE2EDuration="4.770454835s" podCreationTimestamp="2025-12-05 06:58:51 +0000 UTC" firstStartedPulling="2025-12-05 06:58:52.684538317 +0000 UTC m=+3951.964549549" lastFinishedPulling="2025-12-05 06:58:55.415704814 +0000 UTC m=+3954.695716036" observedRunningTime="2025-12-05 06:58:55.760677239 +0000 UTC m=+3955.040688471" watchObservedRunningTime="2025-12-05 06:58:55.770454835 +0000 UTC m=+3955.050466057" Dec 05 06:58:56 crc kubenswrapper[4865]: I1205 06:58:56.756145 4865 generic.go:334] "Generic (PLEG): container finished" podID="c8a201b8-4813-4526-926a-51d3f4e99c51" containerID="b09019ddd937bf16ccadf51a39c41bf09e87d3376a81c88fd0e8debcfb6da79a" exitCode=0 Dec 05 06:58:56 crc kubenswrapper[4865]: I1205 06:58:56.756193 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9ds9" 
event={"ID":"c8a201b8-4813-4526-926a-51d3f4e99c51","Type":"ContainerDied","Data":"b09019ddd937bf16ccadf51a39c41bf09e87d3376a81c88fd0e8debcfb6da79a"} Dec 05 06:58:57 crc kubenswrapper[4865]: I1205 06:58:57.771514 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9ds9" event={"ID":"c8a201b8-4813-4526-926a-51d3f4e99c51","Type":"ContainerStarted","Data":"a09ea982a627151cb4dfb9f55d036f4ae01aa8fc21a649ae94cc5063e463bea3"} Dec 05 06:58:59 crc kubenswrapper[4865]: I1205 06:58:59.812727 4865 generic.go:334] "Generic (PLEG): container finished" podID="c8a201b8-4813-4526-926a-51d3f4e99c51" containerID="a09ea982a627151cb4dfb9f55d036f4ae01aa8fc21a649ae94cc5063e463bea3" exitCode=0 Dec 05 06:58:59 crc kubenswrapper[4865]: I1205 06:58:59.812813 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9ds9" event={"ID":"c8a201b8-4813-4526-926a-51d3f4e99c51","Type":"ContainerDied","Data":"a09ea982a627151cb4dfb9f55d036f4ae01aa8fc21a649ae94cc5063e463bea3"} Dec 05 06:59:00 crc kubenswrapper[4865]: I1205 06:59:00.828638 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9ds9" event={"ID":"c8a201b8-4813-4526-926a-51d3f4e99c51","Type":"ContainerStarted","Data":"a060926ddad58dd354c310ce2aec0e25860eb3b654ae54700d662415c64f27fc"} Dec 05 06:59:00 crc kubenswrapper[4865]: I1205 06:59:00.859745 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b9ds9" podStartSLOduration=3.371669695 podStartE2EDuration="6.859724339s" podCreationTimestamp="2025-12-05 06:58:54 +0000 UTC" firstStartedPulling="2025-12-05 06:58:56.758124894 +0000 UTC m=+3956.038136116" lastFinishedPulling="2025-12-05 06:59:00.246179518 +0000 UTC m=+3959.526190760" observedRunningTime="2025-12-05 06:59:00.850430717 +0000 UTC m=+3960.130441959" watchObservedRunningTime="2025-12-05 06:59:00.859724339 +0000 UTC m=+3960.139735561" Dec 05 06:59:01 crc kubenswrapper[4865]: I1205 06:59:01.827393 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7tx99" Dec 05 06:59:01 crc kubenswrapper[4865]: I1205 06:59:01.828764 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7tx99" Dec 05 06:59:01 crc kubenswrapper[4865]: I1205 06:59:01.875243 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7tx99" Dec 05 06:59:02 crc kubenswrapper[4865]: I1205 06:59:02.918486 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7tx99" Dec 05 06:59:04 crc kubenswrapper[4865]: I1205 06:59:04.856314 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7tx99"] Dec 05 06:59:04 crc kubenswrapper[4865]: I1205 06:59:04.877665 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7tx99" podUID="3b5926b0-6318-4464-a4f5-80facbda8f37" containerName="registry-server" containerID="cri-o://1207f3e7648f6fb2ca9a6f4b4da3c15a20c015ed4a7fd5a1b0dde1c8a9a71e82" gracePeriod=2 Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.041905 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b9ds9" Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 
06:59:05.042139 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b9ds9" Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.091948 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b9ds9" Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.606025 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7tx99" Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.730439 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5926b0-6318-4464-a4f5-80facbda8f37-catalog-content\") pod \"3b5926b0-6318-4464-a4f5-80facbda8f37\" (UID: \"3b5926b0-6318-4464-a4f5-80facbda8f37\") " Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.730870 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5926b0-6318-4464-a4f5-80facbda8f37-utilities\") pod \"3b5926b0-6318-4464-a4f5-80facbda8f37\" (UID: \"3b5926b0-6318-4464-a4f5-80facbda8f37\") " Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.731042 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t29wg\" (UniqueName: \"kubernetes.io/projected/3b5926b0-6318-4464-a4f5-80facbda8f37-kube-api-access-t29wg\") pod \"3b5926b0-6318-4464-a4f5-80facbda8f37\" (UID: \"3b5926b0-6318-4464-a4f5-80facbda8f37\") " Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.731764 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b5926b0-6318-4464-a4f5-80facbda8f37-utilities" (OuterVolumeSpecName: "utilities") pod "3b5926b0-6318-4464-a4f5-80facbda8f37" (UID: "3b5926b0-6318-4464-a4f5-80facbda8f37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.743176 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b5926b0-6318-4464-a4f5-80facbda8f37-kube-api-access-t29wg" (OuterVolumeSpecName: "kube-api-access-t29wg") pod "3b5926b0-6318-4464-a4f5-80facbda8f37" (UID: "3b5926b0-6318-4464-a4f5-80facbda8f37"). InnerVolumeSpecName "kube-api-access-t29wg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.747506 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b5926b0-6318-4464-a4f5-80facbda8f37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b5926b0-6318-4464-a4f5-80facbda8f37" (UID: "3b5926b0-6318-4464-a4f5-80facbda8f37"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.833402 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5926b0-6318-4464-a4f5-80facbda8f37-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.833446 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t29wg\" (UniqueName: \"kubernetes.io/projected/3b5926b0-6318-4464-a4f5-80facbda8f37-kube-api-access-t29wg\") on node \"crc\" DevicePath \"\"" Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.833456 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5926b0-6318-4464-a4f5-80facbda8f37-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.889968 4865 generic.go:334] "Generic (PLEG): container finished" podID="3b5926b0-6318-4464-a4f5-80facbda8f37" containerID="1207f3e7648f6fb2ca9a6f4b4da3c15a20c015ed4a7fd5a1b0dde1c8a9a71e82" exitCode=0 Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.890124 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7tx99" event={"ID":"3b5926b0-6318-4464-a4f5-80facbda8f37","Type":"ContainerDied","Data":"1207f3e7648f6fb2ca9a6f4b4da3c15a20c015ed4a7fd5a1b0dde1c8a9a71e82"} Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.890192 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7tx99" event={"ID":"3b5926b0-6318-4464-a4f5-80facbda8f37","Type":"ContainerDied","Data":"23a5e327e8db8347f20d401fe43537b9d94eb23d13b6c473b52ddb6a83167ba9"} Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.890225 4865 scope.go:117] "RemoveContainer" containerID="1207f3e7648f6fb2ca9a6f4b4da3c15a20c015ed4a7fd5a1b0dde1c8a9a71e82" Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.890145 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7tx99" Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.934107 4865 scope.go:117] "RemoveContainer" containerID="6c9b305647a493ba4501db5dcea9a70e87fc98f3be2982bbfd16b7e5f211dbbb" Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.938522 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7tx99"] Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.950630 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b9ds9" Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.964300 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7tx99"] Dec 05 06:59:05 crc kubenswrapper[4865]: I1205 06:59:05.966656 4865 scope.go:117] "RemoveContainer" containerID="cfd4922b9075180a8c3cc458b38ee5177f9e8c179955b7e0dcdb06112449c3ee" Dec 05 06:59:06 crc kubenswrapper[4865]: I1205 06:59:06.009097 4865 scope.go:117] "RemoveContainer" containerID="1207f3e7648f6fb2ca9a6f4b4da3c15a20c015ed4a7fd5a1b0dde1c8a9a71e82" Dec 05 06:59:06 crc kubenswrapper[4865]: E1205 06:59:06.009525 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1207f3e7648f6fb2ca9a6f4b4da3c15a20c015ed4a7fd5a1b0dde1c8a9a71e82\": container with ID starting with 1207f3e7648f6fb2ca9a6f4b4da3c15a20c015ed4a7fd5a1b0dde1c8a9a71e82 not found: ID does not exist" containerID="1207f3e7648f6fb2ca9a6f4b4da3c15a20c015ed4a7fd5a1b0dde1c8a9a71e82" Dec 05 06:59:06 crc kubenswrapper[4865]: I1205 06:59:06.009570 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1207f3e7648f6fb2ca9a6f4b4da3c15a20c015ed4a7fd5a1b0dde1c8a9a71e82"} err="failed to get container status \"1207f3e7648f6fb2ca9a6f4b4da3c15a20c015ed4a7fd5a1b0dde1c8a9a71e82\": rpc error: code = NotFound desc = could not find container \"1207f3e7648f6fb2ca9a6f4b4da3c15a20c015ed4a7fd5a1b0dde1c8a9a71e82\": container with ID starting with 1207f3e7648f6fb2ca9a6f4b4da3c15a20c015ed4a7fd5a1b0dde1c8a9a71e82 not found: ID does not exist" Dec 05 06:59:06 crc kubenswrapper[4865]: I1205 06:59:06.009595 4865 scope.go:117] "RemoveContainer" containerID="6c9b305647a493ba4501db5dcea9a70e87fc98f3be2982bbfd16b7e5f211dbbb" Dec 05 06:59:06 crc kubenswrapper[4865]: E1205 06:59:06.010729 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c9b305647a493ba4501db5dcea9a70e87fc98f3be2982bbfd16b7e5f211dbbb\": container with ID starting with 6c9b305647a493ba4501db5dcea9a70e87fc98f3be2982bbfd16b7e5f211dbbb not found: ID does not exist" containerID="6c9b305647a493ba4501db5dcea9a70e87fc98f3be2982bbfd16b7e5f211dbbb" Dec 05 06:59:06 crc kubenswrapper[4865]: I1205 06:59:06.010763 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c9b305647a493ba4501db5dcea9a70e87fc98f3be2982bbfd16b7e5f211dbbb"} err="failed to get container status \"6c9b305647a493ba4501db5dcea9a70e87fc98f3be2982bbfd16b7e5f211dbbb\": rpc error: code = NotFound desc = could not find container \"6c9b305647a493ba4501db5dcea9a70e87fc98f3be2982bbfd16b7e5f211dbbb\": container with ID starting with 6c9b305647a493ba4501db5dcea9a70e87fc98f3be2982bbfd16b7e5f211dbbb not found: ID does not exist" Dec 05 06:59:06 crc kubenswrapper[4865]: I1205 06:59:06.010782 4865 scope.go:117] "RemoveContainer" 
containerID="cfd4922b9075180a8c3cc458b38ee5177f9e8c179955b7e0dcdb06112449c3ee" Dec 05 06:59:06 crc kubenswrapper[4865]: E1205 06:59:06.011311 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfd4922b9075180a8c3cc458b38ee5177f9e8c179955b7e0dcdb06112449c3ee\": container with ID starting with cfd4922b9075180a8c3cc458b38ee5177f9e8c179955b7e0dcdb06112449c3ee not found: ID does not exist" containerID="cfd4922b9075180a8c3cc458b38ee5177f9e8c179955b7e0dcdb06112449c3ee" Dec 05 06:59:06 crc kubenswrapper[4865]: I1205 06:59:06.011343 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfd4922b9075180a8c3cc458b38ee5177f9e8c179955b7e0dcdb06112449c3ee"} err="failed to get container status \"cfd4922b9075180a8c3cc458b38ee5177f9e8c179955b7e0dcdb06112449c3ee\": rpc error: code = NotFound desc = could not find container \"cfd4922b9075180a8c3cc458b38ee5177f9e8c179955b7e0dcdb06112449c3ee\": container with ID starting with cfd4922b9075180a8c3cc458b38ee5177f9e8c179955b7e0dcdb06112449c3ee not found: ID does not exist" Dec 05 06:59:07 crc kubenswrapper[4865]: I1205 06:59:07.015888 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b5926b0-6318-4464-a4f5-80facbda8f37" path="/var/lib/kubelet/pods/3b5926b0-6318-4464-a4f5-80facbda8f37/volumes" Dec 05 06:59:07 crc kubenswrapper[4865]: I1205 06:59:07.452112 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b9ds9"] Dec 05 06:59:07 crc kubenswrapper[4865]: I1205 06:59:07.910111 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b9ds9" podUID="c8a201b8-4813-4526-926a-51d3f4e99c51" containerName="registry-server" containerID="cri-o://a060926ddad58dd354c310ce2aec0e25860eb3b654ae54700d662415c64f27fc" gracePeriod=2 Dec 05 06:59:08 crc kubenswrapper[4865]: I1205 06:59:08.615887 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b9ds9" Dec 05 06:59:08 crc kubenswrapper[4865]: I1205 06:59:08.688806 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps4cz\" (UniqueName: \"kubernetes.io/projected/c8a201b8-4813-4526-926a-51d3f4e99c51-kube-api-access-ps4cz\") pod \"c8a201b8-4813-4526-926a-51d3f4e99c51\" (UID: \"c8a201b8-4813-4526-926a-51d3f4e99c51\") " Dec 05 06:59:08 crc kubenswrapper[4865]: I1205 06:59:08.688915 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8a201b8-4813-4526-926a-51d3f4e99c51-catalog-content\") pod \"c8a201b8-4813-4526-926a-51d3f4e99c51\" (UID: \"c8a201b8-4813-4526-926a-51d3f4e99c51\") " Dec 05 06:59:08 crc kubenswrapper[4865]: I1205 06:59:08.688944 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8a201b8-4813-4526-926a-51d3f4e99c51-utilities\") pod \"c8a201b8-4813-4526-926a-51d3f4e99c51\" (UID: \"c8a201b8-4813-4526-926a-51d3f4e99c51\") " Dec 05 06:59:08 crc kubenswrapper[4865]: I1205 06:59:08.689972 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8a201b8-4813-4526-926a-51d3f4e99c51-utilities" (OuterVolumeSpecName: "utilities") pod "c8a201b8-4813-4526-926a-51d3f4e99c51" (UID: "c8a201b8-4813-4526-926a-51d3f4e99c51"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:59:08 crc kubenswrapper[4865]: I1205 06:59:08.701409 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8a201b8-4813-4526-926a-51d3f4e99c51-kube-api-access-ps4cz" (OuterVolumeSpecName: "kube-api-access-ps4cz") pod "c8a201b8-4813-4526-926a-51d3f4e99c51" (UID: "c8a201b8-4813-4526-926a-51d3f4e99c51"). InnerVolumeSpecName "kube-api-access-ps4cz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 06:59:08 crc kubenswrapper[4865]: I1205 06:59:08.754247 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8a201b8-4813-4526-926a-51d3f4e99c51-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8a201b8-4813-4526-926a-51d3f4e99c51" (UID: "c8a201b8-4813-4526-926a-51d3f4e99c51"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 06:59:08 crc kubenswrapper[4865]: I1205 06:59:08.791421 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps4cz\" (UniqueName: \"kubernetes.io/projected/c8a201b8-4813-4526-926a-51d3f4e99c51-kube-api-access-ps4cz\") on node \"crc\" DevicePath \"\"" Dec 05 06:59:08 crc kubenswrapper[4865]: I1205 06:59:08.791451 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8a201b8-4813-4526-926a-51d3f4e99c51-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 06:59:08 crc kubenswrapper[4865]: I1205 06:59:08.791461 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8a201b8-4813-4526-926a-51d3f4e99c51-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 06:59:08 crc kubenswrapper[4865]: I1205 06:59:08.920190 4865 generic.go:334] "Generic (PLEG): container finished" podID="c8a201b8-4813-4526-926a-51d3f4e99c51" containerID="a060926ddad58dd354c310ce2aec0e25860eb3b654ae54700d662415c64f27fc" exitCode=0 Dec 05 06:59:08 crc kubenswrapper[4865]: I1205 06:59:08.920237 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9ds9" event={"ID":"c8a201b8-4813-4526-926a-51d3f4e99c51","Type":"ContainerDied","Data":"a060926ddad58dd354c310ce2aec0e25860eb3b654ae54700d662415c64f27fc"} Dec 05 06:59:08 crc kubenswrapper[4865]: I1205 06:59:08.920264 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9ds9" event={"ID":"c8a201b8-4813-4526-926a-51d3f4e99c51","Type":"ContainerDied","Data":"be3553d39bd3d3c7425a9fc022d5812fcd2d7dc7ee69f9f0874ac3d07f44112e"} Dec 05 06:59:08 crc kubenswrapper[4865]: I1205 06:59:08.920278 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b9ds9" Dec 05 06:59:08 crc kubenswrapper[4865]: I1205 06:59:08.920282 4865 scope.go:117] "RemoveContainer" containerID="a060926ddad58dd354c310ce2aec0e25860eb3b654ae54700d662415c64f27fc" Dec 05 06:59:08 crc kubenswrapper[4865]: I1205 06:59:08.946147 4865 scope.go:117] "RemoveContainer" containerID="a09ea982a627151cb4dfb9f55d036f4ae01aa8fc21a649ae94cc5063e463bea3" Dec 05 06:59:08 crc kubenswrapper[4865]: I1205 06:59:08.972605 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b9ds9"] Dec 05 06:59:08 crc kubenswrapper[4865]: I1205 06:59:08.979143 4865 scope.go:117] "RemoveContainer" containerID="b09019ddd937bf16ccadf51a39c41bf09e87d3376a81c88fd0e8debcfb6da79a" Dec 05 06:59:08 crc kubenswrapper[4865]: I1205 06:59:08.986525 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b9ds9"] Dec 05 06:59:09 crc kubenswrapper[4865]: I1205 06:59:09.017793 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8a201b8-4813-4526-926a-51d3f4e99c51" path="/var/lib/kubelet/pods/c8a201b8-4813-4526-926a-51d3f4e99c51/volumes" Dec 05 06:59:09 crc kubenswrapper[4865]: I1205 06:59:09.021357 4865 scope.go:117] "RemoveContainer" containerID="a060926ddad58dd354c310ce2aec0e25860eb3b654ae54700d662415c64f27fc" Dec 05 06:59:09 crc kubenswrapper[4865]: E1205 06:59:09.021731 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a060926ddad58dd354c310ce2aec0e25860eb3b654ae54700d662415c64f27fc\": container with ID starting with a060926ddad58dd354c310ce2aec0e25860eb3b654ae54700d662415c64f27fc not found: ID does not exist" containerID="a060926ddad58dd354c310ce2aec0e25860eb3b654ae54700d662415c64f27fc" Dec 05 06:59:09 crc kubenswrapper[4865]: I1205 06:59:09.021763 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a060926ddad58dd354c310ce2aec0e25860eb3b654ae54700d662415c64f27fc"} err="failed to get container status \"a060926ddad58dd354c310ce2aec0e25860eb3b654ae54700d662415c64f27fc\": rpc error: code = NotFound desc = could not find container \"a060926ddad58dd354c310ce2aec0e25860eb3b654ae54700d662415c64f27fc\": container with ID starting with a060926ddad58dd354c310ce2aec0e25860eb3b654ae54700d662415c64f27fc not found: ID does not exist" Dec 05 06:59:09 crc kubenswrapper[4865]: I1205 06:59:09.021786 4865 scope.go:117] "RemoveContainer" containerID="a09ea982a627151cb4dfb9f55d036f4ae01aa8fc21a649ae94cc5063e463bea3" Dec 05 06:59:09 crc kubenswrapper[4865]: E1205 06:59:09.022031 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a09ea982a627151cb4dfb9f55d036f4ae01aa8fc21a649ae94cc5063e463bea3\": container with ID starting with a09ea982a627151cb4dfb9f55d036f4ae01aa8fc21a649ae94cc5063e463bea3 not found: ID does not exist" containerID="a09ea982a627151cb4dfb9f55d036f4ae01aa8fc21a649ae94cc5063e463bea3" Dec 05 06:59:09 crc kubenswrapper[4865]: I1205 06:59:09.022054 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09ea982a627151cb4dfb9f55d036f4ae01aa8fc21a649ae94cc5063e463bea3"} err="failed to get container status \"a09ea982a627151cb4dfb9f55d036f4ae01aa8fc21a649ae94cc5063e463bea3\": rpc error: code = NotFound desc = could not find container 
\"a09ea982a627151cb4dfb9f55d036f4ae01aa8fc21a649ae94cc5063e463bea3\": container with ID starting with a09ea982a627151cb4dfb9f55d036f4ae01aa8fc21a649ae94cc5063e463bea3 not found: ID does not exist" Dec 05 06:59:09 crc kubenswrapper[4865]: I1205 06:59:09.022071 4865 scope.go:117] "RemoveContainer" containerID="b09019ddd937bf16ccadf51a39c41bf09e87d3376a81c88fd0e8debcfb6da79a" Dec 05 06:59:09 crc kubenswrapper[4865]: E1205 06:59:09.022313 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b09019ddd937bf16ccadf51a39c41bf09e87d3376a81c88fd0e8debcfb6da79a\": container with ID starting with b09019ddd937bf16ccadf51a39c41bf09e87d3376a81c88fd0e8debcfb6da79a not found: ID does not exist" containerID="b09019ddd937bf16ccadf51a39c41bf09e87d3376a81c88fd0e8debcfb6da79a" Dec 05 06:59:09 crc kubenswrapper[4865]: I1205 06:59:09.022335 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b09019ddd937bf16ccadf51a39c41bf09e87d3376a81c88fd0e8debcfb6da79a"} err="failed to get container status \"b09019ddd937bf16ccadf51a39c41bf09e87d3376a81c88fd0e8debcfb6da79a\": rpc error: code = NotFound desc = could not find container \"b09019ddd937bf16ccadf51a39c41bf09e87d3376a81c88fd0e8debcfb6da79a\": container with ID starting with b09019ddd937bf16ccadf51a39c41bf09e87d3376a81c88fd0e8debcfb6da79a not found: ID does not exist" Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.711165 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4t88q"] Dec 05 06:59:38 crc kubenswrapper[4865]: E1205 06:59:38.712019 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a201b8-4813-4526-926a-51d3f4e99c51" containerName="extract-utilities" Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.712032 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a201b8-4813-4526-926a-51d3f4e99c51" containerName="extract-utilities" Dec 05 06:59:38 crc kubenswrapper[4865]: E1205 06:59:38.712041 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5926b0-6318-4464-a4f5-80facbda8f37" containerName="registry-server" Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.712046 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5926b0-6318-4464-a4f5-80facbda8f37" containerName="registry-server" Dec 05 06:59:38 crc kubenswrapper[4865]: E1205 06:59:38.712067 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5926b0-6318-4464-a4f5-80facbda8f37" containerName="extract-utilities" Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.712073 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5926b0-6318-4464-a4f5-80facbda8f37" containerName="extract-utilities" Dec 05 06:59:38 crc kubenswrapper[4865]: E1205 06:59:38.712094 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a201b8-4813-4526-926a-51d3f4e99c51" containerName="extract-content" Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.712100 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a201b8-4813-4526-926a-51d3f4e99c51" containerName="extract-content" Dec 05 06:59:38 crc kubenswrapper[4865]: E1205 06:59:38.712112 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a201b8-4813-4526-926a-51d3f4e99c51" containerName="registry-server" Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.712117 4865 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c8a201b8-4813-4526-926a-51d3f4e99c51" containerName="registry-server" Dec 05 06:59:38 crc kubenswrapper[4865]: E1205 06:59:38.712131 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5926b0-6318-4464-a4f5-80facbda8f37" containerName="extract-content" Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.712136 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5926b0-6318-4464-a4f5-80facbda8f37" containerName="extract-content" Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.712300 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5926b0-6318-4464-a4f5-80facbda8f37" containerName="registry-server" Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.712311 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8a201b8-4813-4526-926a-51d3f4e99c51" containerName="registry-server" Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.714230 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4t88q" Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.740765 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4t88q"] Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.761077 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6023ceb-e486-4e6b-baa8-5d79166df65b-catalog-content\") pod \"redhat-operators-4t88q\" (UID: \"f6023ceb-e486-4e6b-baa8-5d79166df65b\") " pod="openshift-marketplace/redhat-operators-4t88q" Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.761125 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5csc\" (UniqueName: \"kubernetes.io/projected/f6023ceb-e486-4e6b-baa8-5d79166df65b-kube-api-access-l5csc\") pod \"redhat-operators-4t88q\" (UID: \"f6023ceb-e486-4e6b-baa8-5d79166df65b\") " pod="openshift-marketplace/redhat-operators-4t88q" Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.761178 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6023ceb-e486-4e6b-baa8-5d79166df65b-utilities\") pod \"redhat-operators-4t88q\" (UID: \"f6023ceb-e486-4e6b-baa8-5d79166df65b\") " pod="openshift-marketplace/redhat-operators-4t88q" Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.862630 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6023ceb-e486-4e6b-baa8-5d79166df65b-catalog-content\") pod \"redhat-operators-4t88q\" (UID: \"f6023ceb-e486-4e6b-baa8-5d79166df65b\") " pod="openshift-marketplace/redhat-operators-4t88q" Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.862676 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5csc\" (UniqueName: \"kubernetes.io/projected/f6023ceb-e486-4e6b-baa8-5d79166df65b-kube-api-access-l5csc\") pod \"redhat-operators-4t88q\" (UID: \"f6023ceb-e486-4e6b-baa8-5d79166df65b\") " pod="openshift-marketplace/redhat-operators-4t88q" Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.862713 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6023ceb-e486-4e6b-baa8-5d79166df65b-utilities\") pod \"redhat-operators-4t88q\" (UID: 
\"f6023ceb-e486-4e6b-baa8-5d79166df65b\") " pod="openshift-marketplace/redhat-operators-4t88q" Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.863174 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6023ceb-e486-4e6b-baa8-5d79166df65b-catalog-content\") pod \"redhat-operators-4t88q\" (UID: \"f6023ceb-e486-4e6b-baa8-5d79166df65b\") " pod="openshift-marketplace/redhat-operators-4t88q" Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.863189 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6023ceb-e486-4e6b-baa8-5d79166df65b-utilities\") pod \"redhat-operators-4t88q\" (UID: \"f6023ceb-e486-4e6b-baa8-5d79166df65b\") " pod="openshift-marketplace/redhat-operators-4t88q" Dec 05 06:59:38 crc kubenswrapper[4865]: I1205 06:59:38.891184 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5csc\" (UniqueName: \"kubernetes.io/projected/f6023ceb-e486-4e6b-baa8-5d79166df65b-kube-api-access-l5csc\") pod \"redhat-operators-4t88q\" (UID: \"f6023ceb-e486-4e6b-baa8-5d79166df65b\") " pod="openshift-marketplace/redhat-operators-4t88q" Dec 05 06:59:39 crc kubenswrapper[4865]: I1205 06:59:39.034277 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4t88q" Dec 05 06:59:39 crc kubenswrapper[4865]: I1205 06:59:39.534306 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4t88q"] Dec 05 06:59:39 crc kubenswrapper[4865]: W1205 06:59:39.543972 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6023ceb_e486_4e6b_baa8_5d79166df65b.slice/crio-639a8f0fe6c2d22a0a0169e29eb34f175a0de96449f108f2e66ccb27af5eea2e WatchSource:0}: Error finding container 639a8f0fe6c2d22a0a0169e29eb34f175a0de96449f108f2e66ccb27af5eea2e: Status 404 returned error can't find the container with id 639a8f0fe6c2d22a0a0169e29eb34f175a0de96449f108f2e66ccb27af5eea2e Dec 05 06:59:40 crc kubenswrapper[4865]: I1205 06:59:40.344392 4865 generic.go:334] "Generic (PLEG): container finished" podID="f6023ceb-e486-4e6b-baa8-5d79166df65b" containerID="060182603e1c84e7bea44eed74070f537e842af0644d38d99f2ce75aede0ad64" exitCode=0 Dec 05 06:59:40 crc kubenswrapper[4865]: I1205 06:59:40.344494 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4t88q" event={"ID":"f6023ceb-e486-4e6b-baa8-5d79166df65b","Type":"ContainerDied","Data":"060182603e1c84e7bea44eed74070f537e842af0644d38d99f2ce75aede0ad64"} Dec 05 06:59:40 crc kubenswrapper[4865]: I1205 06:59:40.345604 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4t88q" event={"ID":"f6023ceb-e486-4e6b-baa8-5d79166df65b","Type":"ContainerStarted","Data":"639a8f0fe6c2d22a0a0169e29eb34f175a0de96449f108f2e66ccb27af5eea2e"} Dec 05 06:59:42 crc kubenswrapper[4865]: I1205 06:59:42.363305 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4t88q" event={"ID":"f6023ceb-e486-4e6b-baa8-5d79166df65b","Type":"ContainerStarted","Data":"d00769cef3066e0eca41ca2cbfc041268f90cf20e1244e42cf3f064a2f2000e8"} Dec 05 06:59:46 crc kubenswrapper[4865]: I1205 06:59:46.406319 4865 generic.go:334] "Generic (PLEG): container finished" podID="f6023ceb-e486-4e6b-baa8-5d79166df65b" 
containerID="d00769cef3066e0eca41ca2cbfc041268f90cf20e1244e42cf3f064a2f2000e8" exitCode=0 Dec 05 06:59:46 crc kubenswrapper[4865]: I1205 06:59:46.406421 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4t88q" event={"ID":"f6023ceb-e486-4e6b-baa8-5d79166df65b","Type":"ContainerDied","Data":"d00769cef3066e0eca41ca2cbfc041268f90cf20e1244e42cf3f064a2f2000e8"} Dec 05 06:59:47 crc kubenswrapper[4865]: I1205 06:59:47.417649 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4t88q" event={"ID":"f6023ceb-e486-4e6b-baa8-5d79166df65b","Type":"ContainerStarted","Data":"58bce7be90fb27ae3a1f15e7d845955730c1c2bb8b60c8f24ff108d4a4c0b8f2"} Dec 05 06:59:49 crc kubenswrapper[4865]: I1205 06:59:49.034651 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4t88q" Dec 05 06:59:49 crc kubenswrapper[4865]: I1205 06:59:49.035978 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4t88q" Dec 05 06:59:50 crc kubenswrapper[4865]: I1205 06:59:50.108429 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4t88q" podUID="f6023ceb-e486-4e6b-baa8-5d79166df65b" containerName="registry-server" probeResult="failure" output=< Dec 05 06:59:50 crc kubenswrapper[4865]: timeout: failed to connect service ":50051" within 1s Dec 05 06:59:50 crc kubenswrapper[4865]: > Dec 05 06:59:59 crc kubenswrapper[4865]: I1205 06:59:59.080698 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4t88q" Dec 05 06:59:59 crc kubenswrapper[4865]: I1205 06:59:59.112279 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4t88q" podStartSLOduration=14.507263587 podStartE2EDuration="21.112257932s" podCreationTimestamp="2025-12-05 06:59:38 +0000 UTC" firstStartedPulling="2025-12-05 06:59:40.346728245 +0000 UTC m=+3999.626739477" lastFinishedPulling="2025-12-05 06:59:46.95172259 +0000 UTC m=+4006.231733822" observedRunningTime="2025-12-05 06:59:47.440768034 +0000 UTC m=+4006.720779276" watchObservedRunningTime="2025-12-05 06:59:59.112257932 +0000 UTC m=+4018.392269154" Dec 05 06:59:59 crc kubenswrapper[4865]: I1205 06:59:59.134427 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4t88q" Dec 05 06:59:59 crc kubenswrapper[4865]: I1205 06:59:59.318279 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4t88q"] Dec 05 07:00:00 crc kubenswrapper[4865]: I1205 07:00:00.196244 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r"] Dec 05 07:00:00 crc kubenswrapper[4865]: I1205 07:00:00.198234 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r" Dec 05 07:00:00 crc kubenswrapper[4865]: I1205 07:00:00.203672 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 07:00:00 crc kubenswrapper[4865]: I1205 07:00:00.206945 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r"] Dec 05 07:00:00 crc kubenswrapper[4865]: I1205 07:00:00.207541 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 07:00:00 crc kubenswrapper[4865]: I1205 07:00:00.304428 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fbd6fb57-2f25-4971-acc2-6d047a2a5a64-secret-volume\") pod \"collect-profiles-29415300-8c29r\" (UID: \"fbd6fb57-2f25-4971-acc2-6d047a2a5a64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r" Dec 05 07:00:00 crc kubenswrapper[4865]: I1205 07:00:00.304901 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vldfj\" (UniqueName: \"kubernetes.io/projected/fbd6fb57-2f25-4971-acc2-6d047a2a5a64-kube-api-access-vldfj\") pod \"collect-profiles-29415300-8c29r\" (UID: \"fbd6fb57-2f25-4971-acc2-6d047a2a5a64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r" Dec 05 07:00:00 crc kubenswrapper[4865]: I1205 07:00:00.305129 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fbd6fb57-2f25-4971-acc2-6d047a2a5a64-config-volume\") pod \"collect-profiles-29415300-8c29r\" (UID: \"fbd6fb57-2f25-4971-acc2-6d047a2a5a64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r" Dec 05 07:00:00 crc kubenswrapper[4865]: I1205 07:00:00.407155 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fbd6fb57-2f25-4971-acc2-6d047a2a5a64-config-volume\") pod \"collect-profiles-29415300-8c29r\" (UID: \"fbd6fb57-2f25-4971-acc2-6d047a2a5a64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r" Dec 05 07:00:00 crc kubenswrapper[4865]: I1205 07:00:00.407560 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fbd6fb57-2f25-4971-acc2-6d047a2a5a64-secret-volume\") pod \"collect-profiles-29415300-8c29r\" (UID: \"fbd6fb57-2f25-4971-acc2-6d047a2a5a64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r" Dec 05 07:00:00 crc kubenswrapper[4865]: I1205 07:00:00.407742 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vldfj\" (UniqueName: \"kubernetes.io/projected/fbd6fb57-2f25-4971-acc2-6d047a2a5a64-kube-api-access-vldfj\") pod \"collect-profiles-29415300-8c29r\" (UID: \"fbd6fb57-2f25-4971-acc2-6d047a2a5a64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r" Dec 05 07:00:00 crc kubenswrapper[4865]: I1205 07:00:00.407994 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fbd6fb57-2f25-4971-acc2-6d047a2a5a64-config-volume\") pod 
\"collect-profiles-29415300-8c29r\" (UID: \"fbd6fb57-2f25-4971-acc2-6d047a2a5a64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r" Dec 05 07:00:00 crc kubenswrapper[4865]: I1205 07:00:00.416473 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fbd6fb57-2f25-4971-acc2-6d047a2a5a64-secret-volume\") pod \"collect-profiles-29415300-8c29r\" (UID: \"fbd6fb57-2f25-4971-acc2-6d047a2a5a64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r" Dec 05 07:00:00 crc kubenswrapper[4865]: I1205 07:00:00.429509 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vldfj\" (UniqueName: \"kubernetes.io/projected/fbd6fb57-2f25-4971-acc2-6d047a2a5a64-kube-api-access-vldfj\") pod \"collect-profiles-29415300-8c29r\" (UID: \"fbd6fb57-2f25-4971-acc2-6d047a2a5a64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r" Dec 05 07:00:00 crc kubenswrapper[4865]: I1205 07:00:00.523412 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r" Dec 05 07:00:00 crc kubenswrapper[4865]: I1205 07:00:00.523920 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4t88q" podUID="f6023ceb-e486-4e6b-baa8-5d79166df65b" containerName="registry-server" containerID="cri-o://58bce7be90fb27ae3a1f15e7d845955730c1c2bb8b60c8f24ff108d4a4c0b8f2" gracePeriod=2 Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.035672 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r"] Dec 05 07:00:01 crc kubenswrapper[4865]: W1205 07:00:01.070462 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbd6fb57_2f25_4971_acc2_6d047a2a5a64.slice/crio-5d9693fd03d651ae2ce0b65271504e10d3f02dd7ef7b2bd07a30ddb43ffa7a80 WatchSource:0}: Error finding container 5d9693fd03d651ae2ce0b65271504e10d3f02dd7ef7b2bd07a30ddb43ffa7a80: Status 404 returned error can't find the container with id 5d9693fd03d651ae2ce0b65271504e10d3f02dd7ef7b2bd07a30ddb43ffa7a80 Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.225123 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4t88q" Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.326052 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5csc\" (UniqueName: \"kubernetes.io/projected/f6023ceb-e486-4e6b-baa8-5d79166df65b-kube-api-access-l5csc\") pod \"f6023ceb-e486-4e6b-baa8-5d79166df65b\" (UID: \"f6023ceb-e486-4e6b-baa8-5d79166df65b\") " Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.326111 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6023ceb-e486-4e6b-baa8-5d79166df65b-utilities\") pod \"f6023ceb-e486-4e6b-baa8-5d79166df65b\" (UID: \"f6023ceb-e486-4e6b-baa8-5d79166df65b\") " Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.326152 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6023ceb-e486-4e6b-baa8-5d79166df65b-catalog-content\") pod \"f6023ceb-e486-4e6b-baa8-5d79166df65b\" (UID: \"f6023ceb-e486-4e6b-baa8-5d79166df65b\") " Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.332698 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6023ceb-e486-4e6b-baa8-5d79166df65b-utilities" (OuterVolumeSpecName: "utilities") pod "f6023ceb-e486-4e6b-baa8-5d79166df65b" (UID: "f6023ceb-e486-4e6b-baa8-5d79166df65b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.338743 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6023ceb-e486-4e6b-baa8-5d79166df65b-kube-api-access-l5csc" (OuterVolumeSpecName: "kube-api-access-l5csc") pod "f6023ceb-e486-4e6b-baa8-5d79166df65b" (UID: "f6023ceb-e486-4e6b-baa8-5d79166df65b"). InnerVolumeSpecName "kube-api-access-l5csc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.428972 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5csc\" (UniqueName: \"kubernetes.io/projected/f6023ceb-e486-4e6b-baa8-5d79166df65b-kube-api-access-l5csc\") on node \"crc\" DevicePath \"\"" Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.429008 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6023ceb-e486-4e6b-baa8-5d79166df65b-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.475444 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6023ceb-e486-4e6b-baa8-5d79166df65b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6023ceb-e486-4e6b-baa8-5d79166df65b" (UID: "f6023ceb-e486-4e6b-baa8-5d79166df65b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.530579 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6023ceb-e486-4e6b-baa8-5d79166df65b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.536696 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r" event={"ID":"fbd6fb57-2f25-4971-acc2-6d047a2a5a64","Type":"ContainerStarted","Data":"00fab88724d59d344e5f9cf4f2edd7f1041cdcb21221655b6602e84eb7a35928"} Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.536747 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r" event={"ID":"fbd6fb57-2f25-4971-acc2-6d047a2a5a64","Type":"ContainerStarted","Data":"5d9693fd03d651ae2ce0b65271504e10d3f02dd7ef7b2bd07a30ddb43ffa7a80"} Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.540872 4865 generic.go:334] "Generic (PLEG): container finished" podID="f6023ceb-e486-4e6b-baa8-5d79166df65b" containerID="58bce7be90fb27ae3a1f15e7d845955730c1c2bb8b60c8f24ff108d4a4c0b8f2" exitCode=0 Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.540926 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4t88q" event={"ID":"f6023ceb-e486-4e6b-baa8-5d79166df65b","Type":"ContainerDied","Data":"58bce7be90fb27ae3a1f15e7d845955730c1c2bb8b60c8f24ff108d4a4c0b8f2"} Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.540948 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4t88q" Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.540979 4865 scope.go:117] "RemoveContainer" containerID="58bce7be90fb27ae3a1f15e7d845955730c1c2bb8b60c8f24ff108d4a4c0b8f2" Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.540965 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4t88q" event={"ID":"f6023ceb-e486-4e6b-baa8-5d79166df65b","Type":"ContainerDied","Data":"639a8f0fe6c2d22a0a0169e29eb34f175a0de96449f108f2e66ccb27af5eea2e"} Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.563650 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r" podStartSLOduration=1.5636309769999999 podStartE2EDuration="1.563630977s" podCreationTimestamp="2025-12-05 07:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 07:00:01.555897018 +0000 UTC m=+4020.835908240" watchObservedRunningTime="2025-12-05 07:00:01.563630977 +0000 UTC m=+4020.843642199" Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.574895 4865 scope.go:117] "RemoveContainer" containerID="d00769cef3066e0eca41ca2cbfc041268f90cf20e1244e42cf3f064a2f2000e8" Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.585563 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4t88q"] Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.596871 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4t88q"] Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.600693 4865 scope.go:117] "RemoveContainer" 
containerID="060182603e1c84e7bea44eed74070f537e842af0644d38d99f2ce75aede0ad64" Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.696262 4865 scope.go:117] "RemoveContainer" containerID="58bce7be90fb27ae3a1f15e7d845955730c1c2bb8b60c8f24ff108d4a4c0b8f2" Dec 05 07:00:01 crc kubenswrapper[4865]: E1205 07:00:01.696888 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58bce7be90fb27ae3a1f15e7d845955730c1c2bb8b60c8f24ff108d4a4c0b8f2\": container with ID starting with 58bce7be90fb27ae3a1f15e7d845955730c1c2bb8b60c8f24ff108d4a4c0b8f2 not found: ID does not exist" containerID="58bce7be90fb27ae3a1f15e7d845955730c1c2bb8b60c8f24ff108d4a4c0b8f2" Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.696928 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58bce7be90fb27ae3a1f15e7d845955730c1c2bb8b60c8f24ff108d4a4c0b8f2"} err="failed to get container status \"58bce7be90fb27ae3a1f15e7d845955730c1c2bb8b60c8f24ff108d4a4c0b8f2\": rpc error: code = NotFound desc = could not find container \"58bce7be90fb27ae3a1f15e7d845955730c1c2bb8b60c8f24ff108d4a4c0b8f2\": container with ID starting with 58bce7be90fb27ae3a1f15e7d845955730c1c2bb8b60c8f24ff108d4a4c0b8f2 not found: ID does not exist" Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.696957 4865 scope.go:117] "RemoveContainer" containerID="d00769cef3066e0eca41ca2cbfc041268f90cf20e1244e42cf3f064a2f2000e8" Dec 05 07:00:01 crc kubenswrapper[4865]: E1205 07:00:01.697312 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d00769cef3066e0eca41ca2cbfc041268f90cf20e1244e42cf3f064a2f2000e8\": container with ID starting with d00769cef3066e0eca41ca2cbfc041268f90cf20e1244e42cf3f064a2f2000e8 not found: ID does not exist" containerID="d00769cef3066e0eca41ca2cbfc041268f90cf20e1244e42cf3f064a2f2000e8" Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.697342 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d00769cef3066e0eca41ca2cbfc041268f90cf20e1244e42cf3f064a2f2000e8"} err="failed to get container status \"d00769cef3066e0eca41ca2cbfc041268f90cf20e1244e42cf3f064a2f2000e8\": rpc error: code = NotFound desc = could not find container \"d00769cef3066e0eca41ca2cbfc041268f90cf20e1244e42cf3f064a2f2000e8\": container with ID starting with d00769cef3066e0eca41ca2cbfc041268f90cf20e1244e42cf3f064a2f2000e8 not found: ID does not exist" Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.697361 4865 scope.go:117] "RemoveContainer" containerID="060182603e1c84e7bea44eed74070f537e842af0644d38d99f2ce75aede0ad64" Dec 05 07:00:01 crc kubenswrapper[4865]: E1205 07:00:01.697681 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"060182603e1c84e7bea44eed74070f537e842af0644d38d99f2ce75aede0ad64\": container with ID starting with 060182603e1c84e7bea44eed74070f537e842af0644d38d99f2ce75aede0ad64 not found: ID does not exist" containerID="060182603e1c84e7bea44eed74070f537e842af0644d38d99f2ce75aede0ad64" Dec 05 07:00:01 crc kubenswrapper[4865]: I1205 07:00:01.697702 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060182603e1c84e7bea44eed74070f537e842af0644d38d99f2ce75aede0ad64"} err="failed to get container status \"060182603e1c84e7bea44eed74070f537e842af0644d38d99f2ce75aede0ad64\": rpc error: code = 
NotFound desc = could not find container \"060182603e1c84e7bea44eed74070f537e842af0644d38d99f2ce75aede0ad64\": container with ID starting with 060182603e1c84e7bea44eed74070f537e842af0644d38d99f2ce75aede0ad64 not found: ID does not exist" Dec 05 07:00:02 crc kubenswrapper[4865]: I1205 07:00:02.553286 4865 generic.go:334] "Generic (PLEG): container finished" podID="fbd6fb57-2f25-4971-acc2-6d047a2a5a64" containerID="00fab88724d59d344e5f9cf4f2edd7f1041cdcb21221655b6602e84eb7a35928" exitCode=0 Dec 05 07:00:02 crc kubenswrapper[4865]: I1205 07:00:02.553377 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r" event={"ID":"fbd6fb57-2f25-4971-acc2-6d047a2a5a64","Type":"ContainerDied","Data":"00fab88724d59d344e5f9cf4f2edd7f1041cdcb21221655b6602e84eb7a35928"} Dec 05 07:00:03 crc kubenswrapper[4865]: I1205 07:00:03.017159 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6023ceb-e486-4e6b-baa8-5d79166df65b" path="/var/lib/kubelet/pods/f6023ceb-e486-4e6b-baa8-5d79166df65b/volumes" Dec 05 07:00:04 crc kubenswrapper[4865]: I1205 07:00:04.358635 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r" Dec 05 07:00:04 crc kubenswrapper[4865]: I1205 07:00:04.386100 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fbd6fb57-2f25-4971-acc2-6d047a2a5a64-secret-volume\") pod \"fbd6fb57-2f25-4971-acc2-6d047a2a5a64\" (UID: \"fbd6fb57-2f25-4971-acc2-6d047a2a5a64\") " Dec 05 07:00:04 crc kubenswrapper[4865]: I1205 07:00:04.386184 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vldfj\" (UniqueName: \"kubernetes.io/projected/fbd6fb57-2f25-4971-acc2-6d047a2a5a64-kube-api-access-vldfj\") pod \"fbd6fb57-2f25-4971-acc2-6d047a2a5a64\" (UID: \"fbd6fb57-2f25-4971-acc2-6d047a2a5a64\") " Dec 05 07:00:04 crc kubenswrapper[4865]: I1205 07:00:04.386374 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fbd6fb57-2f25-4971-acc2-6d047a2a5a64-config-volume\") pod \"fbd6fb57-2f25-4971-acc2-6d047a2a5a64\" (UID: \"fbd6fb57-2f25-4971-acc2-6d047a2a5a64\") " Dec 05 07:00:04 crc kubenswrapper[4865]: I1205 07:00:04.387640 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbd6fb57-2f25-4971-acc2-6d047a2a5a64-config-volume" (OuterVolumeSpecName: "config-volume") pod "fbd6fb57-2f25-4971-acc2-6d047a2a5a64" (UID: "fbd6fb57-2f25-4971-acc2-6d047a2a5a64"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 07:00:04 crc kubenswrapper[4865]: I1205 07:00:04.425590 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbd6fb57-2f25-4971-acc2-6d047a2a5a64-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fbd6fb57-2f25-4971-acc2-6d047a2a5a64" (UID: "fbd6fb57-2f25-4971-acc2-6d047a2a5a64"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 07:00:04 crc kubenswrapper[4865]: I1205 07:00:04.426537 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbd6fb57-2f25-4971-acc2-6d047a2a5a64-kube-api-access-vldfj" (OuterVolumeSpecName: "kube-api-access-vldfj") pod "fbd6fb57-2f25-4971-acc2-6d047a2a5a64" (UID: "fbd6fb57-2f25-4971-acc2-6d047a2a5a64"). InnerVolumeSpecName "kube-api-access-vldfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:00:04 crc kubenswrapper[4865]: I1205 07:00:04.488383 4865 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fbd6fb57-2f25-4971-acc2-6d047a2a5a64-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 07:00:04 crc kubenswrapper[4865]: I1205 07:00:04.488418 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vldfj\" (UniqueName: \"kubernetes.io/projected/fbd6fb57-2f25-4971-acc2-6d047a2a5a64-kube-api-access-vldfj\") on node \"crc\" DevicePath \"\"" Dec 05 07:00:04 crc kubenswrapper[4865]: I1205 07:00:04.488428 4865 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fbd6fb57-2f25-4971-acc2-6d047a2a5a64-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 07:00:04 crc kubenswrapper[4865]: I1205 07:00:04.573019 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r" event={"ID":"fbd6fb57-2f25-4971-acc2-6d047a2a5a64","Type":"ContainerDied","Data":"5d9693fd03d651ae2ce0b65271504e10d3f02dd7ef7b2bd07a30ddb43ffa7a80"} Dec 05 07:00:04 crc kubenswrapper[4865]: I1205 07:00:04.573065 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d9693fd03d651ae2ce0b65271504e10d3f02dd7ef7b2bd07a30ddb43ffa7a80" Dec 05 07:00:04 crc kubenswrapper[4865]: I1205 07:00:04.573068 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415300-8c29r" Dec 05 07:00:04 crc kubenswrapper[4865]: I1205 07:00:04.639515 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv"] Dec 05 07:00:04 crc kubenswrapper[4865]: I1205 07:00:04.647166 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415255-c89vv"] Dec 05 07:00:05 crc kubenswrapper[4865]: I1205 07:00:05.016498 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88440d1b-88fe-4432-a09d-df871904d502" path="/var/lib/kubelet/pods/88440d1b-88fe-4432-a09d-df871904d502/volumes" Dec 05 07:00:40 crc kubenswrapper[4865]: I1205 07:00:40.048484 4865 scope.go:117] "RemoveContainer" containerID="9cf6a947e61a32670b42daa890ce63a443c01d59a593e3f1cdbc031c5047aaa3" Dec 05 07:00:41 crc kubenswrapper[4865]: I1205 07:00:41.050278 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 07:00:41 crc kubenswrapper[4865]: I1205 07:00:41.050581 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.164484 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29415301-frs8q"] Dec 05 07:01:00 crc kubenswrapper[4865]: E1205 07:01:00.165443 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbd6fb57-2f25-4971-acc2-6d047a2a5a64" containerName="collect-profiles" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.165463 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbd6fb57-2f25-4971-acc2-6d047a2a5a64" containerName="collect-profiles" Dec 05 07:01:00 crc kubenswrapper[4865]: E1205 07:01:00.165489 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6023ceb-e486-4e6b-baa8-5d79166df65b" containerName="extract-content" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.165497 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6023ceb-e486-4e6b-baa8-5d79166df65b" containerName="extract-content" Dec 05 07:01:00 crc kubenswrapper[4865]: E1205 07:01:00.165539 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6023ceb-e486-4e6b-baa8-5d79166df65b" containerName="registry-server" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.165550 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6023ceb-e486-4e6b-baa8-5d79166df65b" containerName="registry-server" Dec 05 07:01:00 crc kubenswrapper[4865]: E1205 07:01:00.165567 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6023ceb-e486-4e6b-baa8-5d79166df65b" containerName="extract-utilities" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.165575 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6023ceb-e486-4e6b-baa8-5d79166df65b" containerName="extract-utilities" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.165865 4865 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fbd6fb57-2f25-4971-acc2-6d047a2a5a64" containerName="collect-profiles" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.165889 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6023ceb-e486-4e6b-baa8-5d79166df65b" containerName="registry-server" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.166681 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29415301-frs8q" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.183236 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29415301-frs8q"] Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.306757 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-fernet-keys\") pod \"keystone-cron-29415301-frs8q\" (UID: \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\") " pod="openstack/keystone-cron-29415301-frs8q" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.306827 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-combined-ca-bundle\") pod \"keystone-cron-29415301-frs8q\" (UID: \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\") " pod="openstack/keystone-cron-29415301-frs8q" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.306978 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltbkj\" (UniqueName: \"kubernetes.io/projected/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-kube-api-access-ltbkj\") pod \"keystone-cron-29415301-frs8q\" (UID: \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\") " pod="openstack/keystone-cron-29415301-frs8q" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.307033 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-config-data\") pod \"keystone-cron-29415301-frs8q\" (UID: \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\") " pod="openstack/keystone-cron-29415301-frs8q" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.408484 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-fernet-keys\") pod \"keystone-cron-29415301-frs8q\" (UID: \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\") " pod="openstack/keystone-cron-29415301-frs8q" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.408568 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-combined-ca-bundle\") pod \"keystone-cron-29415301-frs8q\" (UID: \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\") " pod="openstack/keystone-cron-29415301-frs8q" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.408650 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltbkj\" (UniqueName: \"kubernetes.io/projected/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-kube-api-access-ltbkj\") pod \"keystone-cron-29415301-frs8q\" (UID: \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\") " pod="openstack/keystone-cron-29415301-frs8q" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.408727 4865 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-config-data\") pod \"keystone-cron-29415301-frs8q\" (UID: \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\") " pod="openstack/keystone-cron-29415301-frs8q" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.424960 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-combined-ca-bundle\") pod \"keystone-cron-29415301-frs8q\" (UID: \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\") " pod="openstack/keystone-cron-29415301-frs8q" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.425303 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-config-data\") pod \"keystone-cron-29415301-frs8q\" (UID: \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\") " pod="openstack/keystone-cron-29415301-frs8q" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.449943 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltbkj\" (UniqueName: \"kubernetes.io/projected/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-kube-api-access-ltbkj\") pod \"keystone-cron-29415301-frs8q\" (UID: \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\") " pod="openstack/keystone-cron-29415301-frs8q" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.458636 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-fernet-keys\") pod \"keystone-cron-29415301-frs8q\" (UID: \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\") " pod="openstack/keystone-cron-29415301-frs8q" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.489693 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29415301-frs8q" Dec 05 07:01:00 crc kubenswrapper[4865]: I1205 07:01:00.967813 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29415301-frs8q"] Dec 05 07:01:01 crc kubenswrapper[4865]: I1205 07:01:01.191090 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415301-frs8q" event={"ID":"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0","Type":"ContainerStarted","Data":"f16fff35f0024138b3d99fbc43152722392361e45ea513d7e768e6ad7a766883"} Dec 05 07:01:02 crc kubenswrapper[4865]: I1205 07:01:02.202039 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415301-frs8q" event={"ID":"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0","Type":"ContainerStarted","Data":"c52ff31038b2cf3cefc4a69d39c9de1375e6bfbe964da5309f2576e4575d4940"} Dec 05 07:01:02 crc kubenswrapper[4865]: I1205 07:01:02.227420 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29415301-frs8q" podStartSLOduration=2.227398046 podStartE2EDuration="2.227398046s" podCreationTimestamp="2025-12-05 07:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 07:01:02.222266581 +0000 UTC m=+4081.502277813" watchObservedRunningTime="2025-12-05 07:01:02.227398046 +0000 UTC m=+4081.507409268" Dec 05 07:01:04 crc kubenswrapper[4865]: I1205 07:01:04.220739 4865 generic.go:334] "Generic (PLEG): container finished" podID="20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0" containerID="c52ff31038b2cf3cefc4a69d39c9de1375e6bfbe964da5309f2576e4575d4940" exitCode=0 Dec 05 07:01:04 crc kubenswrapper[4865]: I1205 07:01:04.220817 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415301-frs8q" event={"ID":"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0","Type":"ContainerDied","Data":"c52ff31038b2cf3cefc4a69d39c9de1375e6bfbe964da5309f2576e4575d4940"} Dec 05 07:01:05 crc kubenswrapper[4865]: I1205 07:01:05.680265 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29415301-frs8q" Dec 05 07:01:05 crc kubenswrapper[4865]: I1205 07:01:05.812490 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltbkj\" (UniqueName: \"kubernetes.io/projected/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-kube-api-access-ltbkj\") pod \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\" (UID: \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\") " Dec 05 07:01:05 crc kubenswrapper[4865]: I1205 07:01:05.812586 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-combined-ca-bundle\") pod \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\" (UID: \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\") " Dec 05 07:01:05 crc kubenswrapper[4865]: I1205 07:01:05.812727 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-fernet-keys\") pod \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\" (UID: \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\") " Dec 05 07:01:05 crc kubenswrapper[4865]: I1205 07:01:05.812802 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-config-data\") pod \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\" (UID: \"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0\") " Dec 05 07:01:05 crc kubenswrapper[4865]: I1205 07:01:05.832548 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0" (UID: "20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 07:01:05 crc kubenswrapper[4865]: I1205 07:01:05.832639 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-kube-api-access-ltbkj" (OuterVolumeSpecName: "kube-api-access-ltbkj") pod "20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0" (UID: "20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0"). InnerVolumeSpecName "kube-api-access-ltbkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:01:05 crc kubenswrapper[4865]: I1205 07:01:05.863062 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0" (UID: "20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 07:01:05 crc kubenswrapper[4865]: I1205 07:01:05.885659 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-config-data" (OuterVolumeSpecName: "config-data") pod "20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0" (UID: "20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 07:01:05 crc kubenswrapper[4865]: I1205 07:01:05.920927 4865 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 07:01:05 crc kubenswrapper[4865]: I1205 07:01:05.920982 4865 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 07:01:05 crc kubenswrapper[4865]: I1205 07:01:05.920995 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 07:01:05 crc kubenswrapper[4865]: I1205 07:01:05.921008 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltbkj\" (UniqueName: \"kubernetes.io/projected/20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0-kube-api-access-ltbkj\") on node \"crc\" DevicePath \"\"" Dec 05 07:01:06 crc kubenswrapper[4865]: I1205 07:01:06.239651 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415301-frs8q" event={"ID":"20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0","Type":"ContainerDied","Data":"f16fff35f0024138b3d99fbc43152722392361e45ea513d7e768e6ad7a766883"} Dec 05 07:01:06 crc kubenswrapper[4865]: I1205 07:01:06.239695 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f16fff35f0024138b3d99fbc43152722392361e45ea513d7e768e6ad7a766883" Dec 05 07:01:06 crc kubenswrapper[4865]: I1205 07:01:06.239726 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29415301-frs8q" Dec 05 07:01:11 crc kubenswrapper[4865]: I1205 07:01:11.048853 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 07:01:11 crc kubenswrapper[4865]: I1205 07:01:11.049291 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 07:01:14 crc kubenswrapper[4865]: I1205 07:01:14.725380 4865 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fzpfrb" podUID="2d41068d-3439-4a1d-bb73-9d974c281d4c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.84:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 07:01:41 crc kubenswrapper[4865]: I1205 07:01:41.048737 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 07:01:41 crc kubenswrapper[4865]: I1205 07:01:41.049408 4865 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 07:01:41 crc kubenswrapper[4865]: I1205 07:01:41.049457 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 07:01:41 crc kubenswrapper[4865]: I1205 07:01:41.050502 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"af6ec7858844e06383504664e3428c363dcae3dff9a3af262236079433a1d744"} pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 07:01:41 crc kubenswrapper[4865]: I1205 07:01:41.050577 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" containerID="cri-o://af6ec7858844e06383504664e3428c363dcae3dff9a3af262236079433a1d744" gracePeriod=600 Dec 05 07:01:41 crc kubenswrapper[4865]: I1205 07:01:41.604555 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="af6ec7858844e06383504664e3428c363dcae3dff9a3af262236079433a1d744" exitCode=0 Dec 05 07:01:41 crc kubenswrapper[4865]: I1205 07:01:41.604748 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"af6ec7858844e06383504664e3428c363dcae3dff9a3af262236079433a1d744"} Dec 05 07:01:41 crc kubenswrapper[4865]: I1205 07:01:41.604915 4865 scope.go:117] "RemoveContainer" containerID="2dc23b345fbad75f7782861f5fa51d530b4363a634f6a30a1f168dfbe3524203" Dec 05 07:01:42 crc kubenswrapper[4865]: I1205 07:01:42.618811 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294"} Dec 05 07:03:41 crc kubenswrapper[4865]: I1205 07:03:41.049641 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 07:03:41 crc kubenswrapper[4865]: I1205 07:03:41.050377 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 07:04:11 crc kubenswrapper[4865]: I1205 07:04:11.062071 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 07:04:11 crc 
kubenswrapper[4865]: I1205 07:04:11.062878 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 07:04:41 crc kubenswrapper[4865]: I1205 07:04:41.048935 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 07:04:41 crc kubenswrapper[4865]: I1205 07:04:41.051572 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 07:04:41 crc kubenswrapper[4865]: I1205 07:04:41.051799 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 07:04:41 crc kubenswrapper[4865]: I1205 07:04:41.053093 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294"} pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 07:04:41 crc kubenswrapper[4865]: I1205 07:04:41.053321 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" containerID="cri-o://5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" gracePeriod=600 Dec 05 07:04:41 crc kubenswrapper[4865]: E1205 07:04:41.186589 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:04:41 crc kubenswrapper[4865]: I1205 07:04:41.732285 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" exitCode=0 Dec 05 07:04:41 crc kubenswrapper[4865]: I1205 07:04:41.732325 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294"} Dec 05 07:04:41 crc kubenswrapper[4865]: I1205 07:04:41.732358 4865 scope.go:117] "RemoveContainer" containerID="af6ec7858844e06383504664e3428c363dcae3dff9a3af262236079433a1d744" Dec 05 07:04:41 crc kubenswrapper[4865]: I1205 07:04:41.732978 4865 scope.go:117] "RemoveContainer" 
containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:04:41 crc kubenswrapper[4865]: E1205 07:04:41.733203 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:04:56 crc kubenswrapper[4865]: I1205 07:04:56.006700 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:04:56 crc kubenswrapper[4865]: E1205 07:04:56.007620 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:05:09 crc kubenswrapper[4865]: I1205 07:05:09.007416 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:05:09 crc kubenswrapper[4865]: E1205 07:05:09.009259 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:05:20 crc kubenswrapper[4865]: I1205 07:05:20.006132 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:05:20 crc kubenswrapper[4865]: E1205 07:05:20.006897 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:05:34 crc kubenswrapper[4865]: I1205 07:05:34.006294 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:05:34 crc kubenswrapper[4865]: E1205 07:05:34.008291 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:05:48 crc kubenswrapper[4865]: I1205 07:05:48.006460 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:05:48 crc kubenswrapper[4865]: E1205 07:05:48.007242 4865 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:05:59 crc kubenswrapper[4865]: I1205 07:05:59.006951 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:05:59 crc kubenswrapper[4865]: E1205 07:05:59.009421 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:06:11 crc kubenswrapper[4865]: I1205 07:06:11.013154 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:06:11 crc kubenswrapper[4865]: E1205 07:06:11.014030 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:06:26 crc kubenswrapper[4865]: I1205 07:06:26.006762 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:06:26 crc kubenswrapper[4865]: E1205 07:06:26.007703 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:06:40 crc kubenswrapper[4865]: I1205 07:06:40.007068 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:06:40 crc kubenswrapper[4865]: E1205 07:06:40.007626 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:06:50 crc kubenswrapper[4865]: I1205 07:06:50.051869 4865 generic.go:334] "Generic (PLEG): container finished" podID="564b1ff3-5b9c-4058-94b2-a488e26b27dc" containerID="3a3fcfaa61b18668b7c50b718db1a67c7783a5dceccc412ddcef5e33a56be7f1" exitCode=0 Dec 05 07:06:50 crc kubenswrapper[4865]: I1205 07:06:50.051957 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"564b1ff3-5b9c-4058-94b2-a488e26b27dc","Type":"ContainerDied","Data":"3a3fcfaa61b18668b7c50b718db1a67c7783a5dceccc412ddcef5e33a56be7f1"} Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.428646 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.612870 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/564b1ff3-5b9c-4058-94b2-a488e26b27dc-ssh-key\") pod \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.612943 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/564b1ff3-5b9c-4058-94b2-a488e26b27dc-ca-certs\") pod \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.613001 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.613056 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/564b1ff3-5b9c-4058-94b2-a488e26b27dc-config-data\") pod \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.613123 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/564b1ff3-5b9c-4058-94b2-a488e26b27dc-test-operator-ephemeral-workdir\") pod \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.613166 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6tgl\" (UniqueName: \"kubernetes.io/projected/564b1ff3-5b9c-4058-94b2-a488e26b27dc-kube-api-access-c6tgl\") pod \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.613198 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/564b1ff3-5b9c-4058-94b2-a488e26b27dc-openstack-config\") pod \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.613242 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/564b1ff3-5b9c-4058-94b2-a488e26b27dc-test-operator-ephemeral-temporary\") pod \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\" (UID: \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.613279 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/564b1ff3-5b9c-4058-94b2-a488e26b27dc-openstack-config-secret\") pod \"564b1ff3-5b9c-4058-94b2-a488e26b27dc\" (UID: 
\"564b1ff3-5b9c-4058-94b2-a488e26b27dc\") " Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.615037 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/564b1ff3-5b9c-4058-94b2-a488e26b27dc-config-data" (OuterVolumeSpecName: "config-data") pod "564b1ff3-5b9c-4058-94b2-a488e26b27dc" (UID: "564b1ff3-5b9c-4058-94b2-a488e26b27dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.616817 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/564b1ff3-5b9c-4058-94b2-a488e26b27dc-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "564b1ff3-5b9c-4058-94b2-a488e26b27dc" (UID: "564b1ff3-5b9c-4058-94b2-a488e26b27dc"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.621114 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/564b1ff3-5b9c-4058-94b2-a488e26b27dc-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "564b1ff3-5b9c-4058-94b2-a488e26b27dc" (UID: "564b1ff3-5b9c-4058-94b2-a488e26b27dc"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.622461 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564b1ff3-5b9c-4058-94b2-a488e26b27dc-kube-api-access-c6tgl" (OuterVolumeSpecName: "kube-api-access-c6tgl") pod "564b1ff3-5b9c-4058-94b2-a488e26b27dc" (UID: "564b1ff3-5b9c-4058-94b2-a488e26b27dc"). InnerVolumeSpecName "kube-api-access-c6tgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.622641 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "test-operator-logs") pod "564b1ff3-5b9c-4058-94b2-a488e26b27dc" (UID: "564b1ff3-5b9c-4058-94b2-a488e26b27dc"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.645111 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/564b1ff3-5b9c-4058-94b2-a488e26b27dc-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "564b1ff3-5b9c-4058-94b2-a488e26b27dc" (UID: "564b1ff3-5b9c-4058-94b2-a488e26b27dc"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.668698 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/564b1ff3-5b9c-4058-94b2-a488e26b27dc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "564b1ff3-5b9c-4058-94b2-a488e26b27dc" (UID: "564b1ff3-5b9c-4058-94b2-a488e26b27dc"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.670266 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/564b1ff3-5b9c-4058-94b2-a488e26b27dc-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "564b1ff3-5b9c-4058-94b2-a488e26b27dc" (UID: "564b1ff3-5b9c-4058-94b2-a488e26b27dc"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.694283 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/564b1ff3-5b9c-4058-94b2-a488e26b27dc-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "564b1ff3-5b9c-4058-94b2-a488e26b27dc" (UID: "564b1ff3-5b9c-4058-94b2-a488e26b27dc"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.715304 4865 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/564b1ff3-5b9c-4058-94b2-a488e26b27dc-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.715338 4865 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/564b1ff3-5b9c-4058-94b2-a488e26b27dc-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.715349 4865 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/564b1ff3-5b9c-4058-94b2-a488e26b27dc-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.715358 4865 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/564b1ff3-5b9c-4058-94b2-a488e26b27dc-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.715390 4865 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.715403 4865 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/564b1ff3-5b9c-4058-94b2-a488e26b27dc-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.715412 4865 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/564b1ff3-5b9c-4058-94b2-a488e26b27dc-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.715421 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6tgl\" (UniqueName: \"kubernetes.io/projected/564b1ff3-5b9c-4058-94b2-a488e26b27dc-kube-api-access-c6tgl\") on node \"crc\" DevicePath \"\"" Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.715430 4865 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/564b1ff3-5b9c-4058-94b2-a488e26b27dc-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.756919 4865 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 05 07:06:51 crc kubenswrapper[4865]: I1205 07:06:51.817407 4865 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 05 07:06:52 crc kubenswrapper[4865]: I1205 07:06:52.079908 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"564b1ff3-5b9c-4058-94b2-a488e26b27dc","Type":"ContainerDied","Data":"49e1a0a6ece17732af381659bcd98fa5ab102f78c1df7f8110c0fd5df50c8636"} Dec 05 07:06:52 crc kubenswrapper[4865]: I1205 07:06:52.080216 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49e1a0a6ece17732af381659bcd98fa5ab102f78c1df7f8110c0fd5df50c8636" Dec 05 07:06:52 crc kubenswrapper[4865]: I1205 07:06:52.080393 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 07:06:54 crc kubenswrapper[4865]: I1205 07:06:54.006909 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:06:54 crc kubenswrapper[4865]: E1205 07:06:54.007563 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:07:00 crc kubenswrapper[4865]: I1205 07:07:00.189653 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 05 07:07:00 crc kubenswrapper[4865]: E1205 07:07:00.191167 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564b1ff3-5b9c-4058-94b2-a488e26b27dc" containerName="tempest-tests-tempest-tests-runner" Dec 05 07:07:00 crc kubenswrapper[4865]: I1205 07:07:00.191187 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="564b1ff3-5b9c-4058-94b2-a488e26b27dc" containerName="tempest-tests-tempest-tests-runner" Dec 05 07:07:00 crc kubenswrapper[4865]: E1205 07:07:00.191233 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0" containerName="keystone-cron" Dec 05 07:07:00 crc kubenswrapper[4865]: I1205 07:07:00.191241 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0" containerName="keystone-cron" Dec 05 07:07:00 crc kubenswrapper[4865]: I1205 07:07:00.191469 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0" containerName="keystone-cron" Dec 05 07:07:00 crc kubenswrapper[4865]: I1205 07:07:00.191484 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="564b1ff3-5b9c-4058-94b2-a488e26b27dc" containerName="tempest-tests-tempest-tests-runner" Dec 05 07:07:00 crc kubenswrapper[4865]: I1205 07:07:00.192555 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 07:07:00 crc kubenswrapper[4865]: I1205 07:07:00.201147 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-c9k8f" Dec 05 07:07:00 crc kubenswrapper[4865]: I1205 07:07:00.202469 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 05 07:07:00 crc kubenswrapper[4865]: I1205 07:07:00.284454 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88krp\" (UniqueName: \"kubernetes.io/projected/43529b69-5dae-4d58-9246-664fe5f3489e-kube-api-access-88krp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"43529b69-5dae-4d58-9246-664fe5f3489e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 07:07:00 crc kubenswrapper[4865]: I1205 07:07:00.286175 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"43529b69-5dae-4d58-9246-664fe5f3489e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 07:07:00 crc kubenswrapper[4865]: I1205 07:07:00.388153 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"43529b69-5dae-4d58-9246-664fe5f3489e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 07:07:00 crc kubenswrapper[4865]: I1205 07:07:00.388788 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88krp\" (UniqueName: \"kubernetes.io/projected/43529b69-5dae-4d58-9246-664fe5f3489e-kube-api-access-88krp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"43529b69-5dae-4d58-9246-664fe5f3489e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 07:07:00 crc kubenswrapper[4865]: I1205 07:07:00.389519 4865 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"43529b69-5dae-4d58-9246-664fe5f3489e\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 07:07:00 crc kubenswrapper[4865]: I1205 07:07:00.424333 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88krp\" (UniqueName: \"kubernetes.io/projected/43529b69-5dae-4d58-9246-664fe5f3489e-kube-api-access-88krp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"43529b69-5dae-4d58-9246-664fe5f3489e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 07:07:00 crc kubenswrapper[4865]: I1205 07:07:00.431680 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"43529b69-5dae-4d58-9246-664fe5f3489e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 07:07:00 crc 
kubenswrapper[4865]: I1205 07:07:00.516786 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 07:07:00 crc kubenswrapper[4865]: I1205 07:07:00.994569 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 05 07:07:01 crc kubenswrapper[4865]: I1205 07:07:01.005419 4865 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 07:07:01 crc kubenswrapper[4865]: I1205 07:07:01.198378 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"43529b69-5dae-4d58-9246-664fe5f3489e","Type":"ContainerStarted","Data":"aa07fe509d79dd5101207b5c01a6ece11991728497500213b1a35990926d2ce7"} Dec 05 07:07:02 crc kubenswrapper[4865]: I1205 07:07:02.213496 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"43529b69-5dae-4d58-9246-664fe5f3489e","Type":"ContainerStarted","Data":"2f89b54fdc9fd4df4e3438d89d157f6055b932946c06b59ae01dd438f0997a83"} Dec 05 07:07:02 crc kubenswrapper[4865]: I1205 07:07:02.245687 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.372090183 podStartE2EDuration="2.245666765s" podCreationTimestamp="2025-12-05 07:07:00 +0000 UTC" firstStartedPulling="2025-12-05 07:07:01.004914576 +0000 UTC m=+4440.284925798" lastFinishedPulling="2025-12-05 07:07:01.878491158 +0000 UTC m=+4441.158502380" observedRunningTime="2025-12-05 07:07:02.231292538 +0000 UTC m=+4441.511303760" watchObservedRunningTime="2025-12-05 07:07:02.245666765 +0000 UTC m=+4441.525677997" Dec 05 07:07:05 crc kubenswrapper[4865]: I1205 07:07:05.006677 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:07:05 crc kubenswrapper[4865]: E1205 07:07:05.007601 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:07:20 crc kubenswrapper[4865]: I1205 07:07:20.006326 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:07:20 crc kubenswrapper[4865]: E1205 07:07:20.007225 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:07:25 crc kubenswrapper[4865]: I1205 07:07:25.755022 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dj7mb/must-gather-dqxr4"] Dec 05 07:07:25 crc kubenswrapper[4865]: I1205 07:07:25.757602 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dj7mb/must-gather-dqxr4" Dec 05 07:07:25 crc kubenswrapper[4865]: I1205 07:07:25.761134 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dj7mb"/"openshift-service-ca.crt" Dec 05 07:07:25 crc kubenswrapper[4865]: I1205 07:07:25.761337 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-dj7mb"/"default-dockercfg-28dn8" Dec 05 07:07:25 crc kubenswrapper[4865]: I1205 07:07:25.761388 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dj7mb"/"kube-root-ca.crt" Dec 05 07:07:25 crc kubenswrapper[4865]: I1205 07:07:25.773159 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dj7mb/must-gather-dqxr4"] Dec 05 07:07:25 crc kubenswrapper[4865]: I1205 07:07:25.880478 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22grm\" (UniqueName: \"kubernetes.io/projected/04ea41ec-f680-4a62-b48c-c08b1b13fba5-kube-api-access-22grm\") pod \"must-gather-dqxr4\" (UID: \"04ea41ec-f680-4a62-b48c-c08b1b13fba5\") " pod="openshift-must-gather-dj7mb/must-gather-dqxr4" Dec 05 07:07:25 crc kubenswrapper[4865]: I1205 07:07:25.880562 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/04ea41ec-f680-4a62-b48c-c08b1b13fba5-must-gather-output\") pod \"must-gather-dqxr4\" (UID: \"04ea41ec-f680-4a62-b48c-c08b1b13fba5\") " pod="openshift-must-gather-dj7mb/must-gather-dqxr4" Dec 05 07:07:25 crc kubenswrapper[4865]: I1205 07:07:25.982869 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22grm\" (UniqueName: \"kubernetes.io/projected/04ea41ec-f680-4a62-b48c-c08b1b13fba5-kube-api-access-22grm\") pod \"must-gather-dqxr4\" (UID: \"04ea41ec-f680-4a62-b48c-c08b1b13fba5\") " pod="openshift-must-gather-dj7mb/must-gather-dqxr4" Dec 05 07:07:25 crc kubenswrapper[4865]: I1205 07:07:25.982940 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/04ea41ec-f680-4a62-b48c-c08b1b13fba5-must-gather-output\") pod \"must-gather-dqxr4\" (UID: \"04ea41ec-f680-4a62-b48c-c08b1b13fba5\") " pod="openshift-must-gather-dj7mb/must-gather-dqxr4" Dec 05 07:07:25 crc kubenswrapper[4865]: I1205 07:07:25.983663 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/04ea41ec-f680-4a62-b48c-c08b1b13fba5-must-gather-output\") pod \"must-gather-dqxr4\" (UID: \"04ea41ec-f680-4a62-b48c-c08b1b13fba5\") " pod="openshift-must-gather-dj7mb/must-gather-dqxr4" Dec 05 07:07:26 crc kubenswrapper[4865]: I1205 07:07:26.326117 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22grm\" (UniqueName: \"kubernetes.io/projected/04ea41ec-f680-4a62-b48c-c08b1b13fba5-kube-api-access-22grm\") pod \"must-gather-dqxr4\" (UID: \"04ea41ec-f680-4a62-b48c-c08b1b13fba5\") " pod="openshift-must-gather-dj7mb/must-gather-dqxr4" Dec 05 07:07:26 crc kubenswrapper[4865]: I1205 07:07:26.381230 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dj7mb/must-gather-dqxr4" Dec 05 07:07:26 crc kubenswrapper[4865]: I1205 07:07:26.980742 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dj7mb/must-gather-dqxr4"] Dec 05 07:07:27 crc kubenswrapper[4865]: I1205 07:07:27.469302 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dj7mb/must-gather-dqxr4" event={"ID":"04ea41ec-f680-4a62-b48c-c08b1b13fba5","Type":"ContainerStarted","Data":"ec276f4b311b49bc61be29aef6482266dbbe757e7521405977dffd2ad7880762"} Dec 05 07:07:32 crc kubenswrapper[4865]: I1205 07:07:32.510593 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dj7mb/must-gather-dqxr4" event={"ID":"04ea41ec-f680-4a62-b48c-c08b1b13fba5","Type":"ContainerStarted","Data":"f78fef9404961b97d11cf5cffcf60c62710d9bde297c6b65a458927cf00316f4"} Dec 05 07:07:32 crc kubenswrapper[4865]: I1205 07:07:32.511109 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dj7mb/must-gather-dqxr4" event={"ID":"04ea41ec-f680-4a62-b48c-c08b1b13fba5","Type":"ContainerStarted","Data":"098327d78b3f8f9d3e4516f264ed2509bcc14bf00dab816d17ae49045a6a1e43"} Dec 05 07:07:32 crc kubenswrapper[4865]: I1205 07:07:32.530858 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dj7mb/must-gather-dqxr4" podStartSLOduration=3.012277354 podStartE2EDuration="7.530814476s" podCreationTimestamp="2025-12-05 07:07:25 +0000 UTC" firstStartedPulling="2025-12-05 07:07:26.979078326 +0000 UTC m=+4466.259089548" lastFinishedPulling="2025-12-05 07:07:31.497615448 +0000 UTC m=+4470.777626670" observedRunningTime="2025-12-05 07:07:32.526432252 +0000 UTC m=+4471.806443484" watchObservedRunningTime="2025-12-05 07:07:32.530814476 +0000 UTC m=+4471.810825708" Dec 05 07:07:34 crc kubenswrapper[4865]: I1205 07:07:34.006684 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:07:34 crc kubenswrapper[4865]: E1205 07:07:34.007393 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:07:37 crc kubenswrapper[4865]: I1205 07:07:37.109584 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dj7mb/crc-debug-cvjbz"] Dec 05 07:07:37 crc kubenswrapper[4865]: I1205 07:07:37.447180 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dj7mb/crc-debug-cvjbz" Dec 05 07:07:37 crc kubenswrapper[4865]: I1205 07:07:37.533298 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b7nq\" (UniqueName: \"kubernetes.io/projected/31239c36-31e5-48de-bf51-e2bd6722eb50-kube-api-access-7b7nq\") pod \"crc-debug-cvjbz\" (UID: \"31239c36-31e5-48de-bf51-e2bd6722eb50\") " pod="openshift-must-gather-dj7mb/crc-debug-cvjbz" Dec 05 07:07:37 crc kubenswrapper[4865]: I1205 07:07:37.533399 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31239c36-31e5-48de-bf51-e2bd6722eb50-host\") pod \"crc-debug-cvjbz\" (UID: \"31239c36-31e5-48de-bf51-e2bd6722eb50\") " pod="openshift-must-gather-dj7mb/crc-debug-cvjbz" Dec 05 07:07:37 crc kubenswrapper[4865]: I1205 07:07:37.634657 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b7nq\" (UniqueName: \"kubernetes.io/projected/31239c36-31e5-48de-bf51-e2bd6722eb50-kube-api-access-7b7nq\") pod \"crc-debug-cvjbz\" (UID: \"31239c36-31e5-48de-bf51-e2bd6722eb50\") " pod="openshift-must-gather-dj7mb/crc-debug-cvjbz" Dec 05 07:07:37 crc kubenswrapper[4865]: I1205 07:07:37.634724 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31239c36-31e5-48de-bf51-e2bd6722eb50-host\") pod \"crc-debug-cvjbz\" (UID: \"31239c36-31e5-48de-bf51-e2bd6722eb50\") " pod="openshift-must-gather-dj7mb/crc-debug-cvjbz" Dec 05 07:07:37 crc kubenswrapper[4865]: I1205 07:07:37.634983 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31239c36-31e5-48de-bf51-e2bd6722eb50-host\") pod \"crc-debug-cvjbz\" (UID: \"31239c36-31e5-48de-bf51-e2bd6722eb50\") " pod="openshift-must-gather-dj7mb/crc-debug-cvjbz" Dec 05 07:07:37 crc kubenswrapper[4865]: I1205 07:07:37.652082 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b7nq\" (UniqueName: \"kubernetes.io/projected/31239c36-31e5-48de-bf51-e2bd6722eb50-kube-api-access-7b7nq\") pod \"crc-debug-cvjbz\" (UID: \"31239c36-31e5-48de-bf51-e2bd6722eb50\") " pod="openshift-must-gather-dj7mb/crc-debug-cvjbz" Dec 05 07:07:37 crc kubenswrapper[4865]: I1205 07:07:37.775936 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dj7mb/crc-debug-cvjbz" Dec 05 07:07:38 crc kubenswrapper[4865]: I1205 07:07:38.580796 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dj7mb/crc-debug-cvjbz" event={"ID":"31239c36-31e5-48de-bf51-e2bd6722eb50","Type":"ContainerStarted","Data":"3da7e3a998278ece863bfe730d19d9004a1d811a72313af12889a09f51abb95e"} Dec 05 07:07:47 crc kubenswrapper[4865]: I1205 07:07:47.007142 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:07:47 crc kubenswrapper[4865]: E1205 07:07:47.007972 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:07:51 crc kubenswrapper[4865]: I1205 07:07:51.696045 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dj7mb/crc-debug-cvjbz" event={"ID":"31239c36-31e5-48de-bf51-e2bd6722eb50","Type":"ContainerStarted","Data":"5699c8d99a5e77ba2836c7d1354ba5176d37c683f0301c6854522ffe2099e230"} Dec 05 07:08:01 crc kubenswrapper[4865]: I1205 07:08:01.013120 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:08:01 crc kubenswrapper[4865]: E1205 07:08:01.013944 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:08:15 crc kubenswrapper[4865]: I1205 07:08:15.006439 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:08:15 crc kubenswrapper[4865]: E1205 07:08:15.007286 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:08:30 crc kubenswrapper[4865]: I1205 07:08:30.006655 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:08:30 crc kubenswrapper[4865]: E1205 07:08:30.007327 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:08:42 crc kubenswrapper[4865]: I1205 07:08:42.005996 4865 scope.go:117] "RemoveContainer" 
containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:08:42 crc kubenswrapper[4865]: E1205 07:08:42.006686 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:08:47 crc kubenswrapper[4865]: I1205 07:08:47.152469 4865 generic.go:334] "Generic (PLEG): container finished" podID="31239c36-31e5-48de-bf51-e2bd6722eb50" containerID="5699c8d99a5e77ba2836c7d1354ba5176d37c683f0301c6854522ffe2099e230" exitCode=0 Dec 05 07:08:47 crc kubenswrapper[4865]: I1205 07:08:47.152681 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dj7mb/crc-debug-cvjbz" event={"ID":"31239c36-31e5-48de-bf51-e2bd6722eb50","Type":"ContainerDied","Data":"5699c8d99a5e77ba2836c7d1354ba5176d37c683f0301c6854522ffe2099e230"} Dec 05 07:08:48 crc kubenswrapper[4865]: I1205 07:08:48.297328 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dj7mb/crc-debug-cvjbz" Dec 05 07:08:48 crc kubenswrapper[4865]: I1205 07:08:48.339136 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dj7mb/crc-debug-cvjbz"] Dec 05 07:08:48 crc kubenswrapper[4865]: I1205 07:08:48.350218 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dj7mb/crc-debug-cvjbz"] Dec 05 07:08:48 crc kubenswrapper[4865]: I1205 07:08:48.448452 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b7nq\" (UniqueName: \"kubernetes.io/projected/31239c36-31e5-48de-bf51-e2bd6722eb50-kube-api-access-7b7nq\") pod \"31239c36-31e5-48de-bf51-e2bd6722eb50\" (UID: \"31239c36-31e5-48de-bf51-e2bd6722eb50\") " Dec 05 07:08:48 crc kubenswrapper[4865]: I1205 07:08:48.448532 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31239c36-31e5-48de-bf51-e2bd6722eb50-host\") pod \"31239c36-31e5-48de-bf51-e2bd6722eb50\" (UID: \"31239c36-31e5-48de-bf51-e2bd6722eb50\") " Dec 05 07:08:48 crc kubenswrapper[4865]: I1205 07:08:48.449000 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/31239c36-31e5-48de-bf51-e2bd6722eb50-host" (OuterVolumeSpecName: "host") pod "31239c36-31e5-48de-bf51-e2bd6722eb50" (UID: "31239c36-31e5-48de-bf51-e2bd6722eb50"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 07:08:48 crc kubenswrapper[4865]: I1205 07:08:48.455013 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31239c36-31e5-48de-bf51-e2bd6722eb50-kube-api-access-7b7nq" (OuterVolumeSpecName: "kube-api-access-7b7nq") pod "31239c36-31e5-48de-bf51-e2bd6722eb50" (UID: "31239c36-31e5-48de-bf51-e2bd6722eb50"). InnerVolumeSpecName "kube-api-access-7b7nq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:08:48 crc kubenswrapper[4865]: I1205 07:08:48.552665 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b7nq\" (UniqueName: \"kubernetes.io/projected/31239c36-31e5-48de-bf51-e2bd6722eb50-kube-api-access-7b7nq\") on node \"crc\" DevicePath \"\"" Dec 05 07:08:48 crc kubenswrapper[4865]: I1205 07:08:48.552698 4865 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31239c36-31e5-48de-bf51-e2bd6722eb50-host\") on node \"crc\" DevicePath \"\"" Dec 05 07:08:49 crc kubenswrapper[4865]: I1205 07:08:49.026358 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31239c36-31e5-48de-bf51-e2bd6722eb50" path="/var/lib/kubelet/pods/31239c36-31e5-48de-bf51-e2bd6722eb50/volumes" Dec 05 07:08:49 crc kubenswrapper[4865]: I1205 07:08:49.181373 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dj7mb/crc-debug-cvjbz" Dec 05 07:08:49 crc kubenswrapper[4865]: I1205 07:08:49.181767 4865 scope.go:117] "RemoveContainer" containerID="5699c8d99a5e77ba2836c7d1354ba5176d37c683f0301c6854522ffe2099e230" Dec 05 07:08:49 crc kubenswrapper[4865]: I1205 07:08:49.513356 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dj7mb/crc-debug-4m9mp"] Dec 05 07:08:49 crc kubenswrapper[4865]: E1205 07:08:49.514249 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31239c36-31e5-48de-bf51-e2bd6722eb50" containerName="container-00" Dec 05 07:08:49 crc kubenswrapper[4865]: I1205 07:08:49.514268 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="31239c36-31e5-48de-bf51-e2bd6722eb50" containerName="container-00" Dec 05 07:08:49 crc kubenswrapper[4865]: I1205 07:08:49.514560 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="31239c36-31e5-48de-bf51-e2bd6722eb50" containerName="container-00" Dec 05 07:08:49 crc kubenswrapper[4865]: I1205 07:08:49.515340 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dj7mb/crc-debug-4m9mp" Dec 05 07:08:49 crc kubenswrapper[4865]: I1205 07:08:49.674528 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e02a042c-ab99-4be1-ab52-25ccd13e6eb2-host\") pod \"crc-debug-4m9mp\" (UID: \"e02a042c-ab99-4be1-ab52-25ccd13e6eb2\") " pod="openshift-must-gather-dj7mb/crc-debug-4m9mp" Dec 05 07:08:49 crc kubenswrapper[4865]: I1205 07:08:49.674744 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zfxb\" (UniqueName: \"kubernetes.io/projected/e02a042c-ab99-4be1-ab52-25ccd13e6eb2-kube-api-access-9zfxb\") pod \"crc-debug-4m9mp\" (UID: \"e02a042c-ab99-4be1-ab52-25ccd13e6eb2\") " pod="openshift-must-gather-dj7mb/crc-debug-4m9mp" Dec 05 07:08:49 crc kubenswrapper[4865]: I1205 07:08:49.776088 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zfxb\" (UniqueName: \"kubernetes.io/projected/e02a042c-ab99-4be1-ab52-25ccd13e6eb2-kube-api-access-9zfxb\") pod \"crc-debug-4m9mp\" (UID: \"e02a042c-ab99-4be1-ab52-25ccd13e6eb2\") " pod="openshift-must-gather-dj7mb/crc-debug-4m9mp" Dec 05 07:08:49 crc kubenswrapper[4865]: I1205 07:08:49.776176 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e02a042c-ab99-4be1-ab52-25ccd13e6eb2-host\") pod \"crc-debug-4m9mp\" (UID: \"e02a042c-ab99-4be1-ab52-25ccd13e6eb2\") " pod="openshift-must-gather-dj7mb/crc-debug-4m9mp" Dec 05 07:08:49 crc kubenswrapper[4865]: I1205 07:08:49.776312 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e02a042c-ab99-4be1-ab52-25ccd13e6eb2-host\") pod \"crc-debug-4m9mp\" (UID: \"e02a042c-ab99-4be1-ab52-25ccd13e6eb2\") " pod="openshift-must-gather-dj7mb/crc-debug-4m9mp" Dec 05 07:08:49 crc kubenswrapper[4865]: I1205 07:08:49.809698 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zfxb\" (UniqueName: \"kubernetes.io/projected/e02a042c-ab99-4be1-ab52-25ccd13e6eb2-kube-api-access-9zfxb\") pod \"crc-debug-4m9mp\" (UID: \"e02a042c-ab99-4be1-ab52-25ccd13e6eb2\") " pod="openshift-must-gather-dj7mb/crc-debug-4m9mp" Dec 05 07:08:49 crc kubenswrapper[4865]: I1205 07:08:49.832154 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dj7mb/crc-debug-4m9mp" Dec 05 07:08:50 crc kubenswrapper[4865]: I1205 07:08:50.192044 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dj7mb/crc-debug-4m9mp" event={"ID":"e02a042c-ab99-4be1-ab52-25ccd13e6eb2","Type":"ContainerStarted","Data":"40e605da2e755a4d8e14cb10071693f2a258d6830f0479c3b7937d23b0e37af4"} Dec 05 07:08:50 crc kubenswrapper[4865]: I1205 07:08:50.192336 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dj7mb/crc-debug-4m9mp" event={"ID":"e02a042c-ab99-4be1-ab52-25ccd13e6eb2","Type":"ContainerStarted","Data":"57435f10aef4e198cd88bd1f3c07b6615b8a9721b58e53642d20ab239c1e5d61"} Dec 05 07:08:50 crc kubenswrapper[4865]: I1205 07:08:50.213896 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dj7mb/crc-debug-4m9mp" podStartSLOduration=1.213870658 podStartE2EDuration="1.213870658s" podCreationTimestamp="2025-12-05 07:08:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 07:08:50.205865882 +0000 UTC m=+4549.485877094" watchObservedRunningTime="2025-12-05 07:08:50.213870658 +0000 UTC m=+4549.493881890" Dec 05 07:08:51 crc kubenswrapper[4865]: I1205 07:08:51.202783 4865 generic.go:334] "Generic (PLEG): container finished" podID="e02a042c-ab99-4be1-ab52-25ccd13e6eb2" containerID="40e605da2e755a4d8e14cb10071693f2a258d6830f0479c3b7937d23b0e37af4" exitCode=0 Dec 05 07:08:51 crc kubenswrapper[4865]: I1205 07:08:51.203022 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dj7mb/crc-debug-4m9mp" event={"ID":"e02a042c-ab99-4be1-ab52-25ccd13e6eb2","Type":"ContainerDied","Data":"40e605da2e755a4d8e14cb10071693f2a258d6830f0479c3b7937d23b0e37af4"} Dec 05 07:08:52 crc kubenswrapper[4865]: I1205 07:08:52.315762 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dj7mb/crc-debug-4m9mp" Dec 05 07:08:52 crc kubenswrapper[4865]: I1205 07:08:52.357065 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dj7mb/crc-debug-4m9mp"] Dec 05 07:08:52 crc kubenswrapper[4865]: I1205 07:08:52.367727 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dj7mb/crc-debug-4m9mp"] Dec 05 07:08:52 crc kubenswrapper[4865]: I1205 07:08:52.428705 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e02a042c-ab99-4be1-ab52-25ccd13e6eb2-host\") pod \"e02a042c-ab99-4be1-ab52-25ccd13e6eb2\" (UID: \"e02a042c-ab99-4be1-ab52-25ccd13e6eb2\") " Dec 05 07:08:52 crc kubenswrapper[4865]: I1205 07:08:52.428792 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zfxb\" (UniqueName: \"kubernetes.io/projected/e02a042c-ab99-4be1-ab52-25ccd13e6eb2-kube-api-access-9zfxb\") pod \"e02a042c-ab99-4be1-ab52-25ccd13e6eb2\" (UID: \"e02a042c-ab99-4be1-ab52-25ccd13e6eb2\") " Dec 05 07:08:52 crc kubenswrapper[4865]: I1205 07:08:52.428812 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e02a042c-ab99-4be1-ab52-25ccd13e6eb2-host" (OuterVolumeSpecName: "host") pod "e02a042c-ab99-4be1-ab52-25ccd13e6eb2" (UID: "e02a042c-ab99-4be1-ab52-25ccd13e6eb2"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 07:08:52 crc kubenswrapper[4865]: I1205 07:08:52.429371 4865 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e02a042c-ab99-4be1-ab52-25ccd13e6eb2-host\") on node \"crc\" DevicePath \"\"" Dec 05 07:08:52 crc kubenswrapper[4865]: I1205 07:08:52.435966 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e02a042c-ab99-4be1-ab52-25ccd13e6eb2-kube-api-access-9zfxb" (OuterVolumeSpecName: "kube-api-access-9zfxb") pod "e02a042c-ab99-4be1-ab52-25ccd13e6eb2" (UID: "e02a042c-ab99-4be1-ab52-25ccd13e6eb2"). InnerVolumeSpecName "kube-api-access-9zfxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:08:52 crc kubenswrapper[4865]: I1205 07:08:52.530922 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zfxb\" (UniqueName: \"kubernetes.io/projected/e02a042c-ab99-4be1-ab52-25ccd13e6eb2-kube-api-access-9zfxb\") on node \"crc\" DevicePath \"\"" Dec 05 07:08:53 crc kubenswrapper[4865]: I1205 07:08:53.017048 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e02a042c-ab99-4be1-ab52-25ccd13e6eb2" path="/var/lib/kubelet/pods/e02a042c-ab99-4be1-ab52-25ccd13e6eb2/volumes" Dec 05 07:08:53 crc kubenswrapper[4865]: I1205 07:08:53.219125 4865 scope.go:117] "RemoveContainer" containerID="40e605da2e755a4d8e14cb10071693f2a258d6830f0479c3b7937d23b0e37af4" Dec 05 07:08:53 crc kubenswrapper[4865]: I1205 07:08:53.219140 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dj7mb/crc-debug-4m9mp" Dec 05 07:08:53 crc kubenswrapper[4865]: I1205 07:08:53.533590 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dj7mb/crc-debug-lb9z9"] Dec 05 07:08:53 crc kubenswrapper[4865]: E1205 07:08:53.534193 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e02a042c-ab99-4be1-ab52-25ccd13e6eb2" containerName="container-00" Dec 05 07:08:53 crc kubenswrapper[4865]: I1205 07:08:53.534205 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="e02a042c-ab99-4be1-ab52-25ccd13e6eb2" containerName="container-00" Dec 05 07:08:53 crc kubenswrapper[4865]: I1205 07:08:53.534374 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="e02a042c-ab99-4be1-ab52-25ccd13e6eb2" containerName="container-00" Dec 05 07:08:53 crc kubenswrapper[4865]: I1205 07:08:53.534936 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dj7mb/crc-debug-lb9z9" Dec 05 07:08:53 crc kubenswrapper[4865]: I1205 07:08:53.650805 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xqx6\" (UniqueName: \"kubernetes.io/projected/5f3ba175-b037-4f1e-bc8e-1a855ca1f928-kube-api-access-4xqx6\") pod \"crc-debug-lb9z9\" (UID: \"5f3ba175-b037-4f1e-bc8e-1a855ca1f928\") " pod="openshift-must-gather-dj7mb/crc-debug-lb9z9" Dec 05 07:08:53 crc kubenswrapper[4865]: I1205 07:08:53.650917 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f3ba175-b037-4f1e-bc8e-1a855ca1f928-host\") pod \"crc-debug-lb9z9\" (UID: \"5f3ba175-b037-4f1e-bc8e-1a855ca1f928\") " pod="openshift-must-gather-dj7mb/crc-debug-lb9z9" Dec 05 07:08:53 crc kubenswrapper[4865]: I1205 07:08:53.753088 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xqx6\" (UniqueName: \"kubernetes.io/projected/5f3ba175-b037-4f1e-bc8e-1a855ca1f928-kube-api-access-4xqx6\") pod \"crc-debug-lb9z9\" (UID: \"5f3ba175-b037-4f1e-bc8e-1a855ca1f928\") " pod="openshift-must-gather-dj7mb/crc-debug-lb9z9" Dec 05 07:08:53 crc kubenswrapper[4865]: I1205 07:08:53.753180 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f3ba175-b037-4f1e-bc8e-1a855ca1f928-host\") pod \"crc-debug-lb9z9\" (UID: \"5f3ba175-b037-4f1e-bc8e-1a855ca1f928\") " pod="openshift-must-gather-dj7mb/crc-debug-lb9z9" Dec 05 07:08:53 crc kubenswrapper[4865]: I1205 07:08:53.753325 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f3ba175-b037-4f1e-bc8e-1a855ca1f928-host\") pod \"crc-debug-lb9z9\" (UID: \"5f3ba175-b037-4f1e-bc8e-1a855ca1f928\") " pod="openshift-must-gather-dj7mb/crc-debug-lb9z9" Dec 05 07:08:53 crc kubenswrapper[4865]: I1205 07:08:53.773928 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xqx6\" (UniqueName: \"kubernetes.io/projected/5f3ba175-b037-4f1e-bc8e-1a855ca1f928-kube-api-access-4xqx6\") pod \"crc-debug-lb9z9\" (UID: \"5f3ba175-b037-4f1e-bc8e-1a855ca1f928\") " pod="openshift-must-gather-dj7mb/crc-debug-lb9z9" Dec 05 07:08:53 crc kubenswrapper[4865]: I1205 07:08:53.852571 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dj7mb/crc-debug-lb9z9" Dec 05 07:08:54 crc kubenswrapper[4865]: I1205 07:08:54.007314 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:08:54 crc kubenswrapper[4865]: E1205 07:08:54.008030 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:08:54 crc kubenswrapper[4865]: I1205 07:08:54.230368 4865 generic.go:334] "Generic (PLEG): container finished" podID="5f3ba175-b037-4f1e-bc8e-1a855ca1f928" containerID="2ee58d1a5e6d608f5cc821ba384955ec078db5de4a1982f4b15bfabd0cd7bd3d" exitCode=0 Dec 05 07:08:54 crc kubenswrapper[4865]: I1205 07:08:54.230407 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dj7mb/crc-debug-lb9z9" event={"ID":"5f3ba175-b037-4f1e-bc8e-1a855ca1f928","Type":"ContainerDied","Data":"2ee58d1a5e6d608f5cc821ba384955ec078db5de4a1982f4b15bfabd0cd7bd3d"} Dec 05 07:08:54 crc kubenswrapper[4865]: I1205 07:08:54.230427 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dj7mb/crc-debug-lb9z9" event={"ID":"5f3ba175-b037-4f1e-bc8e-1a855ca1f928","Type":"ContainerStarted","Data":"89a6531d981abf7277fad1b9c1b8160f0ed041b1d9a19bf604a99b15d4e0880b"} Dec 05 07:08:54 crc kubenswrapper[4865]: I1205 07:08:54.272154 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dj7mb/crc-debug-lb9z9"] Dec 05 07:08:54 crc kubenswrapper[4865]: I1205 07:08:54.280625 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dj7mb/crc-debug-lb9z9"] Dec 05 07:08:55 crc kubenswrapper[4865]: I1205 07:08:55.342781 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dj7mb/crc-debug-lb9z9" Dec 05 07:08:55 crc kubenswrapper[4865]: I1205 07:08:55.488360 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xqx6\" (UniqueName: \"kubernetes.io/projected/5f3ba175-b037-4f1e-bc8e-1a855ca1f928-kube-api-access-4xqx6\") pod \"5f3ba175-b037-4f1e-bc8e-1a855ca1f928\" (UID: \"5f3ba175-b037-4f1e-bc8e-1a855ca1f928\") " Dec 05 07:08:55 crc kubenswrapper[4865]: I1205 07:08:55.488571 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f3ba175-b037-4f1e-bc8e-1a855ca1f928-host\") pod \"5f3ba175-b037-4f1e-bc8e-1a855ca1f928\" (UID: \"5f3ba175-b037-4f1e-bc8e-1a855ca1f928\") " Dec 05 07:08:55 crc kubenswrapper[4865]: I1205 07:08:55.489183 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f3ba175-b037-4f1e-bc8e-1a855ca1f928-host" (OuterVolumeSpecName: "host") pod "5f3ba175-b037-4f1e-bc8e-1a855ca1f928" (UID: "5f3ba175-b037-4f1e-bc8e-1a855ca1f928"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 07:08:55 crc kubenswrapper[4865]: I1205 07:08:55.501310 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f3ba175-b037-4f1e-bc8e-1a855ca1f928-kube-api-access-4xqx6" (OuterVolumeSpecName: "kube-api-access-4xqx6") pod "5f3ba175-b037-4f1e-bc8e-1a855ca1f928" (UID: "5f3ba175-b037-4f1e-bc8e-1a855ca1f928"). InnerVolumeSpecName "kube-api-access-4xqx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:08:55 crc kubenswrapper[4865]: I1205 07:08:55.590247 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xqx6\" (UniqueName: \"kubernetes.io/projected/5f3ba175-b037-4f1e-bc8e-1a855ca1f928-kube-api-access-4xqx6\") on node \"crc\" DevicePath \"\"" Dec 05 07:08:55 crc kubenswrapper[4865]: I1205 07:08:55.590463 4865 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f3ba175-b037-4f1e-bc8e-1a855ca1f928-host\") on node \"crc\" DevicePath \"\"" Dec 05 07:08:56 crc kubenswrapper[4865]: I1205 07:08:56.247701 4865 scope.go:117] "RemoveContainer" containerID="2ee58d1a5e6d608f5cc821ba384955ec078db5de4a1982f4b15bfabd0cd7bd3d" Dec 05 07:08:56 crc kubenswrapper[4865]: I1205 07:08:56.247954 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dj7mb/crc-debug-lb9z9" Dec 05 07:08:57 crc kubenswrapper[4865]: I1205 07:08:57.021452 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f3ba175-b037-4f1e-bc8e-1a855ca1f928" path="/var/lib/kubelet/pods/5f3ba175-b037-4f1e-bc8e-1a855ca1f928/volumes" Dec 05 07:09:05 crc kubenswrapper[4865]: I1205 07:09:05.007107 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:09:05 crc kubenswrapper[4865]: E1205 07:09:05.007814 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:09:13 crc kubenswrapper[4865]: I1205 07:09:13.496498 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hk584"] Dec 05 07:09:13 crc kubenswrapper[4865]: E1205 07:09:13.497960 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f3ba175-b037-4f1e-bc8e-1a855ca1f928" containerName="container-00" Dec 05 07:09:13 crc kubenswrapper[4865]: I1205 07:09:13.497993 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f3ba175-b037-4f1e-bc8e-1a855ca1f928" containerName="container-00" Dec 05 07:09:13 crc kubenswrapper[4865]: I1205 07:09:13.498467 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f3ba175-b037-4f1e-bc8e-1a855ca1f928" containerName="container-00" Dec 05 07:09:13 crc kubenswrapper[4865]: I1205 07:09:13.501631 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk584" Dec 05 07:09:13 crc kubenswrapper[4865]: I1205 07:09:13.517117 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk584"] Dec 05 07:09:13 crc kubenswrapper[4865]: I1205 07:09:13.666483 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e-utilities\") pod \"redhat-marketplace-hk584\" (UID: \"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e\") " pod="openshift-marketplace/redhat-marketplace-hk584" Dec 05 07:09:13 crc kubenswrapper[4865]: I1205 07:09:13.666933 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbkxh\" (UniqueName: \"kubernetes.io/projected/d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e-kube-api-access-fbkxh\") pod \"redhat-marketplace-hk584\" (UID: \"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e\") " pod="openshift-marketplace/redhat-marketplace-hk584" Dec 05 07:09:13 crc kubenswrapper[4865]: I1205 07:09:13.666987 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e-catalog-content\") pod \"redhat-marketplace-hk584\" (UID: \"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e\") " pod="openshift-marketplace/redhat-marketplace-hk584" Dec 05 07:09:13 crc kubenswrapper[4865]: I1205 07:09:13.768738 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e-utilities\") pod \"redhat-marketplace-hk584\" (UID: \"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e\") " pod="openshift-marketplace/redhat-marketplace-hk584" Dec 05 07:09:13 crc kubenswrapper[4865]: I1205 07:09:13.768893 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbkxh\" (UniqueName: \"kubernetes.io/projected/d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e-kube-api-access-fbkxh\") pod \"redhat-marketplace-hk584\" (UID: \"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e\") " pod="openshift-marketplace/redhat-marketplace-hk584" Dec 05 07:09:13 crc kubenswrapper[4865]: I1205 07:09:13.769058 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e-catalog-content\") pod \"redhat-marketplace-hk584\" (UID: \"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e\") " pod="openshift-marketplace/redhat-marketplace-hk584" Dec 05 07:09:13 crc kubenswrapper[4865]: I1205 07:09:13.769330 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e-utilities\") pod \"redhat-marketplace-hk584\" (UID: \"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e\") " pod="openshift-marketplace/redhat-marketplace-hk584" Dec 05 07:09:13 crc kubenswrapper[4865]: I1205 07:09:13.769576 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e-catalog-content\") pod \"redhat-marketplace-hk584\" (UID: \"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e\") " pod="openshift-marketplace/redhat-marketplace-hk584" Dec 05 07:09:13 crc kubenswrapper[4865]: I1205 07:09:13.794947 4865 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-fbkxh\" (UniqueName: \"kubernetes.io/projected/d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e-kube-api-access-fbkxh\") pod \"redhat-marketplace-hk584\" (UID: \"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e\") " pod="openshift-marketplace/redhat-marketplace-hk584" Dec 05 07:09:13 crc kubenswrapper[4865]: I1205 07:09:13.836588 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk584" Dec 05 07:09:14 crc kubenswrapper[4865]: I1205 07:09:14.415392 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk584"] Dec 05 07:09:15 crc kubenswrapper[4865]: I1205 07:09:15.454234 4865 generic.go:334] "Generic (PLEG): container finished" podID="d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e" containerID="f8087c3b246f9445a8d630abcec7b7d4c5080dfda30c41022fdc6d4bea7c9464" exitCode=0 Dec 05 07:09:15 crc kubenswrapper[4865]: I1205 07:09:15.454365 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk584" event={"ID":"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e","Type":"ContainerDied","Data":"f8087c3b246f9445a8d630abcec7b7d4c5080dfda30c41022fdc6d4bea7c9464"} Dec 05 07:09:15 crc kubenswrapper[4865]: I1205 07:09:15.455700 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk584" event={"ID":"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e","Type":"ContainerStarted","Data":"678a1c4b4570e0928327aeb17c5b1d7695aa0d3ad71edcbd773fe60ca54ae4cb"} Dec 05 07:09:16 crc kubenswrapper[4865]: I1205 07:09:16.466157 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk584" event={"ID":"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e","Type":"ContainerStarted","Data":"bf16de56a1999e51006fcab3454e36cd53c408c163dd5463429eb24138dc6ab4"} Dec 05 07:09:17 crc kubenswrapper[4865]: I1205 07:09:17.476316 4865 generic.go:334] "Generic (PLEG): container finished" podID="d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e" containerID="bf16de56a1999e51006fcab3454e36cd53c408c163dd5463429eb24138dc6ab4" exitCode=0 Dec 05 07:09:17 crc kubenswrapper[4865]: I1205 07:09:17.476405 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk584" event={"ID":"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e","Type":"ContainerDied","Data":"bf16de56a1999e51006fcab3454e36cd53c408c163dd5463429eb24138dc6ab4"} Dec 05 07:09:18 crc kubenswrapper[4865]: I1205 07:09:18.486222 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk584" event={"ID":"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e","Type":"ContainerStarted","Data":"d9183ccd068722048b4581676dc60a9f0883613a9c31f4e141eaad6ba4d0f44c"} Dec 05 07:09:18 crc kubenswrapper[4865]: I1205 07:09:18.506162 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hk584" podStartSLOduration=3.049698742 podStartE2EDuration="5.506142782s" podCreationTimestamp="2025-12-05 07:09:13 +0000 UTC" firstStartedPulling="2025-12-05 07:09:15.457943473 +0000 UTC m=+4574.737954695" lastFinishedPulling="2025-12-05 07:09:17.914387513 +0000 UTC m=+4577.194398735" observedRunningTime="2025-12-05 07:09:18.500211904 +0000 UTC m=+4577.780223126" watchObservedRunningTime="2025-12-05 07:09:18.506142782 +0000 UTC m=+4577.786154004" Dec 05 07:09:19 crc kubenswrapper[4865]: I1205 07:09:19.713214 4865 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-c67fc55d6-grhds_115995c2-39bf-4d60-bcf9-ca342384137a/barbican-api/0.log" Dec 05 07:09:19 crc kubenswrapper[4865]: I1205 07:09:19.802003 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-c67fc55d6-grhds_115995c2-39bf-4d60-bcf9-ca342384137a/barbican-api-log/0.log" Dec 05 07:09:20 crc kubenswrapper[4865]: I1205 07:09:20.008015 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:09:20 crc kubenswrapper[4865]: E1205 07:09:20.008219 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:09:20 crc kubenswrapper[4865]: I1205 07:09:20.019456 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6564bc679b-dbsbx_b71ad914-2c87-4cd5-94ad-ffc717f3600a/barbican-keystone-listener/0.log" Dec 05 07:09:20 crc kubenswrapper[4865]: I1205 07:09:20.060617 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6564bc679b-dbsbx_b71ad914-2c87-4cd5-94ad-ffc717f3600a/barbican-keystone-listener-log/0.log" Dec 05 07:09:20 crc kubenswrapper[4865]: I1205 07:09:20.184306 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-95f789cff-nbpm9_527588f6-952d-4f9c-990c-775b34d48d78/barbican-worker/0.log" Dec 05 07:09:20 crc kubenswrapper[4865]: I1205 07:09:20.317530 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-95f789cff-nbpm9_527588f6-952d-4f9c-990c-775b34d48d78/barbican-worker-log/0.log" Dec 05 07:09:20 crc kubenswrapper[4865]: I1205 07:09:20.417499 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j_ea0e7080-5e20-4b45-9896-2cda6b9e332f/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:09:20 crc kubenswrapper[4865]: I1205 07:09:20.707838 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_335ba680-a368-498b-8356-ef03d2c5cfb1/ceilometer-central-agent/0.log" Dec 05 07:09:20 crc kubenswrapper[4865]: I1205 07:09:20.750548 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_335ba680-a368-498b-8356-ef03d2c5cfb1/proxy-httpd/0.log" Dec 05 07:09:20 crc kubenswrapper[4865]: I1205 07:09:20.768988 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_335ba680-a368-498b-8356-ef03d2c5cfb1/ceilometer-notification-agent/0.log" Dec 05 07:09:20 crc kubenswrapper[4865]: I1205 07:09:20.797924 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_335ba680-a368-498b-8356-ef03d2c5cfb1/sg-core/0.log" Dec 05 07:09:21 crc kubenswrapper[4865]: I1205 07:09:21.038277 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2d368636-72ce-46db-ab44-91489de4985f/cinder-api/0.log" Dec 05 07:09:21 crc kubenswrapper[4865]: I1205 07:09:21.102330 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2d368636-72ce-46db-ab44-91489de4985f/cinder-api-log/0.log" Dec 05 07:09:21 crc 
kubenswrapper[4865]: I1205 07:09:21.316776 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3d9b36dc-b4e2-4a85-ab48-63bf2318e717/cinder-scheduler/0.log" Dec 05 07:09:21 crc kubenswrapper[4865]: I1205 07:09:21.421387 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-2652z_7ba48e1b-5d9a-436a-8250-297390ed1781/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:09:21 crc kubenswrapper[4865]: I1205 07:09:21.445756 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3d9b36dc-b4e2-4a85-ab48-63bf2318e717/probe/0.log" Dec 05 07:09:21 crc kubenswrapper[4865]: I1205 07:09:21.713653 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt_f01b2a46-843f-4022-ac72-af49312bbcc8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:09:21 crc kubenswrapper[4865]: I1205 07:09:21.790800 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55fb7f8d4c-hp4ck_1e9a22c2-0e4d-4c25-b694-e3afc4721e58/init/0.log" Dec 05 07:09:22 crc kubenswrapper[4865]: I1205 07:09:22.073708 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt_e0f77448-e553-45f7-90db-3a800258bdf3/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:09:22 crc kubenswrapper[4865]: I1205 07:09:22.133626 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55fb7f8d4c-hp4ck_1e9a22c2-0e4d-4c25-b694-e3afc4721e58/init/0.log" Dec 05 07:09:22 crc kubenswrapper[4865]: I1205 07:09:22.188143 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55fb7f8d4c-hp4ck_1e9a22c2-0e4d-4c25-b694-e3afc4721e58/dnsmasq-dns/0.log" Dec 05 07:09:22 crc kubenswrapper[4865]: I1205 07:09:22.648161 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bf5aac6f-3ab8-412a-92f3-6102f9b75238/glance-httpd/0.log" Dec 05 07:09:22 crc kubenswrapper[4865]: I1205 07:09:22.721301 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bf5aac6f-3ab8-412a-92f3-6102f9b75238/glance-log/0.log" Dec 05 07:09:22 crc kubenswrapper[4865]: I1205 07:09:22.869291 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e54c5fe9-12f5-40a2-a472-249d1510d49c/glance-httpd/0.log" Dec 05 07:09:22 crc kubenswrapper[4865]: I1205 07:09:22.936493 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e54c5fe9-12f5-40a2-a472-249d1510d49c/glance-log/0.log" Dec 05 07:09:23 crc kubenswrapper[4865]: I1205 07:09:23.107688 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-78c59b79fd-5jlv4_0b2dbfc6-6978-4613-a307-d4d4b4b88bc9/horizon/1.log" Dec 05 07:09:23 crc kubenswrapper[4865]: I1205 07:09:23.337183 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-78c59b79fd-5jlv4_0b2dbfc6-6978-4613-a307-d4d4b4b88bc9/horizon/0.log" Dec 05 07:09:23 crc kubenswrapper[4865]: I1205 07:09:23.547941 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-br9nk_3112d62b-5125-4614-a5c3-6a50bf1cc515/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:09:23 crc 
kubenswrapper[4865]: I1205 07:09:23.642386 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-78c59b79fd-5jlv4_0b2dbfc6-6978-4613-a307-d4d4b4b88bc9/horizon-log/0.log" Dec 05 07:09:23 crc kubenswrapper[4865]: I1205 07:09:23.734687 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-pdpbp_65d6bbea-eb81-4cfb-ba7c-e56d423884f8/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:09:23 crc kubenswrapper[4865]: I1205 07:09:23.837098 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hk584" Dec 05 07:09:23 crc kubenswrapper[4865]: I1205 07:09:23.837407 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hk584" Dec 05 07:09:23 crc kubenswrapper[4865]: I1205 07:09:23.897101 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hk584" Dec 05 07:09:23 crc kubenswrapper[4865]: I1205 07:09:23.943160 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29415301-frs8q_20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0/keystone-cron/0.log" Dec 05 07:09:24 crc kubenswrapper[4865]: I1205 07:09:24.190815 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_df463e57-e3b9-4829-bd44-94c3ec6a90fa/kube-state-metrics/0.log" Dec 05 07:09:24 crc kubenswrapper[4865]: I1205 07:09:24.345000 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rvldt_ef2fa284-2648-4c53-8443-e60705efb609/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:09:24 crc kubenswrapper[4865]: I1205 07:09:24.419415 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-668bb48dd6-6gzl7_52184630-757a-4290-a4a0-380b5ffb1c76/keystone-api/0.log" Dec 05 07:09:24 crc kubenswrapper[4865]: I1205 07:09:24.655687 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hk584" Dec 05 07:09:24 crc kubenswrapper[4865]: I1205 07:09:24.737809 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk584"] Dec 05 07:09:24 crc kubenswrapper[4865]: I1205 07:09:24.911079 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb_7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:09:25 crc kubenswrapper[4865]: I1205 07:09:25.130869 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67df96fc59-crcwg_b374397b-c64c-439b-b7eb-01d2fb34f474/neutron-httpd/0.log" Dec 05 07:09:25 crc kubenswrapper[4865]: I1205 07:09:25.363890 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67df96fc59-crcwg_b374397b-c64c-439b-b7eb-01d2fb34f474/neutron-api/0.log" Dec 05 07:09:25 crc kubenswrapper[4865]: I1205 07:09:25.923470 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_5d470cf8-c2ca-4bc1-ab26-d8762af687d1/nova-cell0-conductor-conductor/0.log" Dec 05 07:09:26 crc kubenswrapper[4865]: I1205 07:09:26.130610 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_2587c341-67da-4cfc-a5fc-44d3eeefa9a4/nova-cell1-conductor-conductor/0.log" Dec 
05 07:09:26 crc kubenswrapper[4865]: I1205 07:09:26.535291 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_bd86e12e-6ef3-41e5-9f84-e8d45ddaead0/nova-cell1-novncproxy-novncproxy/0.log" Dec 05 07:09:26 crc kubenswrapper[4865]: I1205 07:09:26.612988 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hk584" podUID="d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e" containerName="registry-server" containerID="cri-o://d9183ccd068722048b4581676dc60a9f0883613a9c31f4e141eaad6ba4d0f44c" gracePeriod=2 Dec 05 07:09:26 crc kubenswrapper[4865]: I1205 07:09:26.656862 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_95b08d9c-9466-4aef-b330-160d014e1e9d/nova-api-log/0.log" Dec 05 07:09:26 crc kubenswrapper[4865]: I1205 07:09:26.847045 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-8pmn8_4b81cc6f-f002-4a0d-911f-2aedbec17e6c/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.116750 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk584" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.170544 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_95b08d9c-9466-4aef-b330-160d014e1e9d/nova-api-api/0.log" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.179100 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_668d173f-5e28-427e-a382-f905813fc91e/nova-metadata-log/0.log" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.187308 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e-utilities\") pod \"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e\" (UID: \"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e\") " Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.187582 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e-catalog-content\") pod \"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e\" (UID: \"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e\") " Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.187671 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbkxh\" (UniqueName: \"kubernetes.io/projected/d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e-kube-api-access-fbkxh\") pod \"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e\" (UID: \"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e\") " Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.188963 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e-utilities" (OuterVolumeSpecName: "utilities") pod "d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e" (UID: "d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.194792 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e-kube-api-access-fbkxh" (OuterVolumeSpecName: "kube-api-access-fbkxh") pod "d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e" (UID: "d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e"). 
InnerVolumeSpecName "kube-api-access-fbkxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.226697 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e" (UID: "d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.289752 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.289781 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbkxh\" (UniqueName: \"kubernetes.io/projected/d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e-kube-api-access-fbkxh\") on node \"crc\" DevicePath \"\"" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.289791 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.432193 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9a70babd-c8a6-442f-aa44-d013f3887c93/mysql-bootstrap/0.log" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.622939 4865 generic.go:334] "Generic (PLEG): container finished" podID="d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e" containerID="d9183ccd068722048b4581676dc60a9f0883613a9c31f4e141eaad6ba4d0f44c" exitCode=0 Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.622990 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk584" event={"ID":"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e","Type":"ContainerDied","Data":"d9183ccd068722048b4581676dc60a9f0883613a9c31f4e141eaad6ba4d0f44c"} Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.623023 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk584" event={"ID":"d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e","Type":"ContainerDied","Data":"678a1c4b4570e0928327aeb17c5b1d7695aa0d3ad71edcbd773fe60ca54ae4cb"} Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.623045 4865 scope.go:117] "RemoveContainer" containerID="d9183ccd068722048b4581676dc60a9f0883613a9c31f4e141eaad6ba4d0f44c" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.623106 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk584" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.660235 4865 scope.go:117] "RemoveContainer" containerID="bf16de56a1999e51006fcab3454e36cd53c408c163dd5463429eb24138dc6ab4" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.667240 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk584"] Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.681114 4865 scope.go:117] "RemoveContainer" containerID="f8087c3b246f9445a8d630abcec7b7d4c5080dfda30c41022fdc6d4bea7c9464" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.700674 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk584"] Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.738505 4865 scope.go:117] "RemoveContainer" containerID="d9183ccd068722048b4581676dc60a9f0883613a9c31f4e141eaad6ba4d0f44c" Dec 05 07:09:27 crc kubenswrapper[4865]: E1205 07:09:27.742171 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9183ccd068722048b4581676dc60a9f0883613a9c31f4e141eaad6ba4d0f44c\": container with ID starting with d9183ccd068722048b4581676dc60a9f0883613a9c31f4e141eaad6ba4d0f44c not found: ID does not exist" containerID="d9183ccd068722048b4581676dc60a9f0883613a9c31f4e141eaad6ba4d0f44c" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.742237 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9183ccd068722048b4581676dc60a9f0883613a9c31f4e141eaad6ba4d0f44c"} err="failed to get container status \"d9183ccd068722048b4581676dc60a9f0883613a9c31f4e141eaad6ba4d0f44c\": rpc error: code = NotFound desc = could not find container \"d9183ccd068722048b4581676dc60a9f0883613a9c31f4e141eaad6ba4d0f44c\": container with ID starting with d9183ccd068722048b4581676dc60a9f0883613a9c31f4e141eaad6ba4d0f44c not found: ID does not exist" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.742265 4865 scope.go:117] "RemoveContainer" containerID="bf16de56a1999e51006fcab3454e36cd53c408c163dd5463429eb24138dc6ab4" Dec 05 07:09:27 crc kubenswrapper[4865]: E1205 07:09:27.743106 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf16de56a1999e51006fcab3454e36cd53c408c163dd5463429eb24138dc6ab4\": container with ID starting with bf16de56a1999e51006fcab3454e36cd53c408c163dd5463429eb24138dc6ab4 not found: ID does not exist" containerID="bf16de56a1999e51006fcab3454e36cd53c408c163dd5463429eb24138dc6ab4" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.743146 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf16de56a1999e51006fcab3454e36cd53c408c163dd5463429eb24138dc6ab4"} err="failed to get container status \"bf16de56a1999e51006fcab3454e36cd53c408c163dd5463429eb24138dc6ab4\": rpc error: code = NotFound desc = could not find container \"bf16de56a1999e51006fcab3454e36cd53c408c163dd5463429eb24138dc6ab4\": container with ID starting with bf16de56a1999e51006fcab3454e36cd53c408c163dd5463429eb24138dc6ab4 not found: ID does not exist" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.743161 4865 scope.go:117] "RemoveContainer" containerID="f8087c3b246f9445a8d630abcec7b7d4c5080dfda30c41022fdc6d4bea7c9464" Dec 05 07:09:27 crc kubenswrapper[4865]: E1205 07:09:27.743594 4865 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f8087c3b246f9445a8d630abcec7b7d4c5080dfda30c41022fdc6d4bea7c9464\": container with ID starting with f8087c3b246f9445a8d630abcec7b7d4c5080dfda30c41022fdc6d4bea7c9464 not found: ID does not exist" containerID="f8087c3b246f9445a8d630abcec7b7d4c5080dfda30c41022fdc6d4bea7c9464" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.743640 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8087c3b246f9445a8d630abcec7b7d4c5080dfda30c41022fdc6d4bea7c9464"} err="failed to get container status \"f8087c3b246f9445a8d630abcec7b7d4c5080dfda30c41022fdc6d4bea7c9464\": rpc error: code = NotFound desc = could not find container \"f8087c3b246f9445a8d630abcec7b7d4c5080dfda30c41022fdc6d4bea7c9464\": container with ID starting with f8087c3b246f9445a8d630abcec7b7d4c5080dfda30c41022fdc6d4bea7c9464 not found: ID does not exist" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.876010 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9a70babd-c8a6-442f-aa44-d013f3887c93/mysql-bootstrap/0.log" Dec 05 07:09:27 crc kubenswrapper[4865]: I1205 07:09:27.895566 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9a70babd-c8a6-442f-aa44-d013f3887c93/galera/0.log" Dec 05 07:09:28 crc kubenswrapper[4865]: I1205 07:09:28.158130 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_f6fdeb31-0c08-4c87-82a4-5a51af86aa1f/nova-scheduler-scheduler/0.log" Dec 05 07:09:28 crc kubenswrapper[4865]: I1205 07:09:28.177860 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8a62a048-0ebe-4e5e-988a-4dde7746af74/mysql-bootstrap/0.log" Dec 05 07:09:28 crc kubenswrapper[4865]: I1205 07:09:28.393881 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8a62a048-0ebe-4e5e-988a-4dde7746af74/mysql-bootstrap/0.log" Dec 05 07:09:28 crc kubenswrapper[4865]: I1205 07:09:28.446802 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8a62a048-0ebe-4e5e-988a-4dde7746af74/galera/0.log" Dec 05 07:09:28 crc kubenswrapper[4865]: I1205 07:09:28.624945 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_98a93aae-b37f-4577-9567-e527f3cab3c7/openstackclient/0.log" Dec 05 07:09:28 crc kubenswrapper[4865]: I1205 07:09:28.821368 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-56dth_30eebd2b-aed6-4866-bec4-da326d89821c/ovn-controller/0.log" Dec 05 07:09:29 crc kubenswrapper[4865]: I1205 07:09:29.018153 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e" path="/var/lib/kubelet/pods/d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e/volumes" Dec 05 07:09:29 crc kubenswrapper[4865]: I1205 07:09:29.035138 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_668d173f-5e28-427e-a382-f905813fc91e/nova-metadata-metadata/0.log" Dec 05 07:09:29 crc kubenswrapper[4865]: I1205 07:09:29.056608 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-b4c2s_034bb156-f8de-4fb1-bb44-b952c3f6a019/openstack-network-exporter/0.log" Dec 05 07:09:29 crc kubenswrapper[4865]: I1205 07:09:29.220898 4865 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-jbvmz_b92328b7-456b-45ce-8416-765f465ac793/ovsdb-server-init/0.log" Dec 05 07:09:29 crc kubenswrapper[4865]: I1205 07:09:29.418476 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jbvmz_b92328b7-456b-45ce-8416-765f465ac793/ovs-vswitchd/0.log" Dec 05 07:09:29 crc kubenswrapper[4865]: I1205 07:09:29.443749 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jbvmz_b92328b7-456b-45ce-8416-765f465ac793/ovsdb-server-init/0.log" Dec 05 07:09:29 crc kubenswrapper[4865]: I1205 07:09:29.481938 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jbvmz_b92328b7-456b-45ce-8416-765f465ac793/ovsdb-server/0.log" Dec 05 07:09:29 crc kubenswrapper[4865]: I1205 07:09:29.794956 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c7ebd484-c0dc-45cf-a057-46cb8f76f212/openstack-network-exporter/0.log" Dec 05 07:09:29 crc kubenswrapper[4865]: I1205 07:09:29.869761 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-xr5z8_d6e75882-16f5-4c56-90a8-43d35503e87d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:09:29 crc kubenswrapper[4865]: I1205 07:09:29.904610 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c7ebd484-c0dc-45cf-a057-46cb8f76f212/ovn-northd/0.log" Dec 05 07:09:30 crc kubenswrapper[4865]: I1205 07:09:30.118859 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_43a7a744-bc5d-4bb1-88a2-d90afeb9fdad/openstack-network-exporter/0.log" Dec 05 07:09:30 crc kubenswrapper[4865]: I1205 07:09:30.128450 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_43a7a744-bc5d-4bb1-88a2-d90afeb9fdad/ovsdbserver-nb/0.log" Dec 05 07:09:30 crc kubenswrapper[4865]: I1205 07:09:30.267946 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5400c67c-5f55-47eb-88dc-699ecf76bc95/openstack-network-exporter/0.log" Dec 05 07:09:30 crc kubenswrapper[4865]: I1205 07:09:30.466978 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5400c67c-5f55-47eb-88dc-699ecf76bc95/ovsdbserver-sb/0.log" Dec 05 07:09:30 crc kubenswrapper[4865]: I1205 07:09:30.740726 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-699b5d9784-7n29d_44e70007-d815-432e-9cb5-bc2cc61a86fa/placement-api/0.log" Dec 05 07:09:30 crc kubenswrapper[4865]: I1205 07:09:30.790878 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9197b580-1cf6-4939-abfd-8dcac6a5df7e/setup-container/0.log" Dec 05 07:09:30 crc kubenswrapper[4865]: I1205 07:09:30.846434 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-699b5d9784-7n29d_44e70007-d815-432e-9cb5-bc2cc61a86fa/placement-log/0.log" Dec 05 07:09:31 crc kubenswrapper[4865]: I1205 07:09:31.120556 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9197b580-1cf6-4939-abfd-8dcac6a5df7e/rabbitmq/0.log" Dec 05 07:09:31 crc kubenswrapper[4865]: I1205 07:09:31.207657 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9197b580-1cf6-4939-abfd-8dcac6a5df7e/setup-container/0.log" Dec 05 07:09:31 crc kubenswrapper[4865]: I1205 07:09:31.220217 4865 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_d853a9c1-f9c9-412e-91bb-9f87123db63d/setup-container/0.log" Dec 05 07:09:31 crc kubenswrapper[4865]: I1205 07:09:31.641528 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d853a9c1-f9c9-412e-91bb-9f87123db63d/setup-container/0.log" Dec 05 07:09:31 crc kubenswrapper[4865]: I1205 07:09:31.656209 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx_4bbde20d-cc33-4f77-857e-41bb96a20fe9/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:09:31 crc kubenswrapper[4865]: I1205 07:09:31.657462 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d853a9c1-f9c9-412e-91bb-9f87123db63d/rabbitmq/0.log" Dec 05 07:09:31 crc kubenswrapper[4865]: I1205 07:09:31.850929 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-wwvsm_8555d929-3dc5-4d7c-9635-fcc096789e43/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:09:31 crc kubenswrapper[4865]: I1205 07:09:31.929257 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj_644fb5cf-0fad-4825-9975-46e8c5f3e1ec/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:09:32 crc kubenswrapper[4865]: I1205 07:09:32.154979 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-sjvnw_c38b5b25-e372-4601-9b9d-6b9d883a6953/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:09:32 crc kubenswrapper[4865]: I1205 07:09:32.772990 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-j7bbd_0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b/ssh-known-hosts-edpm-deployment/0.log" Dec 05 07:09:32 crc kubenswrapper[4865]: I1205 07:09:32.991138 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-d56774dc9-sps89_5ae1380d-b481-4842-a4e5-6e96ad87b998/proxy-httpd/0.log" Dec 05 07:09:33 crc kubenswrapper[4865]: I1205 07:09:33.122667 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-d56774dc9-sps89_5ae1380d-b481-4842-a4e5-6e96ad87b998/proxy-server/0.log" Dec 05 07:09:33 crc kubenswrapper[4865]: I1205 07:09:33.189966 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-gx5lg_1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea/swift-ring-rebalance/0.log" Dec 05 07:09:33 crc kubenswrapper[4865]: I1205 07:09:33.389929 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/account-reaper/0.log" Dec 05 07:09:33 crc kubenswrapper[4865]: I1205 07:09:33.402224 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/account-auditor/0.log" Dec 05 07:09:33 crc kubenswrapper[4865]: I1205 07:09:33.511012 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/account-replicator/0.log" Dec 05 07:09:33 crc kubenswrapper[4865]: I1205 07:09:33.653742 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/container-auditor/0.log" Dec 05 07:09:33 crc kubenswrapper[4865]: I1205 07:09:33.687139 4865 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/account-server/0.log" Dec 05 07:09:33 crc kubenswrapper[4865]: I1205 07:09:33.797209 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/container-replicator/0.log" Dec 05 07:09:33 crc kubenswrapper[4865]: I1205 07:09:33.797387 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/container-server/0.log" Dec 05 07:09:34 crc kubenswrapper[4865]: I1205 07:09:34.201887 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/container-updater/0.log" Dec 05 07:09:34 crc kubenswrapper[4865]: I1205 07:09:34.365293 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/object-auditor/0.log" Dec 05 07:09:34 crc kubenswrapper[4865]: I1205 07:09:34.365470 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/object-replicator/0.log" Dec 05 07:09:34 crc kubenswrapper[4865]: I1205 07:09:34.381460 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/object-expirer/0.log" Dec 05 07:09:34 crc kubenswrapper[4865]: I1205 07:09:34.442154 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/object-server/0.log" Dec 05 07:09:34 crc kubenswrapper[4865]: I1205 07:09:34.589784 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/object-updater/0.log" Dec 05 07:09:34 crc kubenswrapper[4865]: I1205 07:09:34.596992 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/swift-recon-cron/0.log" Dec 05 07:09:34 crc kubenswrapper[4865]: I1205 07:09:34.682591 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/rsync/0.log" Dec 05 07:09:34 crc kubenswrapper[4865]: I1205 07:09:34.984750 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7_4fe98c92-1aa9-444a-88d9-1280d7865f92/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:09:35 crc kubenswrapper[4865]: I1205 07:09:35.002311 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_564b1ff3-5b9c-4058-94b2-a488e26b27dc/tempest-tests-tempest-tests-runner/0.log" Dec 05 07:09:35 crc kubenswrapper[4865]: I1205 07:09:35.006462 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:09:35 crc kubenswrapper[4865]: E1205 07:09:35.006702 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:09:35 crc kubenswrapper[4865]: I1205 07:09:35.196341 4865 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_43529b69-5dae-4d58-9246-664fe5f3489e/test-operator-logs-container/0.log" Dec 05 07:09:35 crc kubenswrapper[4865]: I1205 07:09:35.267633 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2_f3b4e4ee-2945-4c62-97e2-c561996ed302/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:09:44 crc kubenswrapper[4865]: I1205 07:09:44.298790 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-spln9"] Dec 05 07:09:44 crc kubenswrapper[4865]: E1205 07:09:44.300157 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e" containerName="registry-server" Dec 05 07:09:44 crc kubenswrapper[4865]: I1205 07:09:44.300175 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e" containerName="registry-server" Dec 05 07:09:44 crc kubenswrapper[4865]: E1205 07:09:44.300210 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e" containerName="extract-content" Dec 05 07:09:44 crc kubenswrapper[4865]: I1205 07:09:44.300217 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e" containerName="extract-content" Dec 05 07:09:44 crc kubenswrapper[4865]: E1205 07:09:44.300241 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e" containerName="extract-utilities" Dec 05 07:09:44 crc kubenswrapper[4865]: I1205 07:09:44.300247 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e" containerName="extract-utilities" Dec 05 07:09:44 crc kubenswrapper[4865]: I1205 07:09:44.300418 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ba4e2f-3ef9-42b8-a6ac-6261a5a4cd2e" containerName="registry-server" Dec 05 07:09:44 crc kubenswrapper[4865]: I1205 07:09:44.301746 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-spln9" Dec 05 07:09:44 crc kubenswrapper[4865]: I1205 07:09:44.333839 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-spln9"] Dec 05 07:09:44 crc kubenswrapper[4865]: I1205 07:09:44.384166 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a118d8-4b9b-465b-bd8d-c10f135b112c-catalog-content\") pod \"community-operators-spln9\" (UID: \"d4a118d8-4b9b-465b-bd8d-c10f135b112c\") " pod="openshift-marketplace/community-operators-spln9" Dec 05 07:09:44 crc kubenswrapper[4865]: I1205 07:09:44.384223 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfj4p\" (UniqueName: \"kubernetes.io/projected/d4a118d8-4b9b-465b-bd8d-c10f135b112c-kube-api-access-jfj4p\") pod \"community-operators-spln9\" (UID: \"d4a118d8-4b9b-465b-bd8d-c10f135b112c\") " pod="openshift-marketplace/community-operators-spln9" Dec 05 07:09:44 crc kubenswrapper[4865]: I1205 07:09:44.384248 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a118d8-4b9b-465b-bd8d-c10f135b112c-utilities\") pod \"community-operators-spln9\" (UID: \"d4a118d8-4b9b-465b-bd8d-c10f135b112c\") " pod="openshift-marketplace/community-operators-spln9" Dec 05 07:09:44 crc kubenswrapper[4865]: I1205 07:09:44.485521 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a118d8-4b9b-465b-bd8d-c10f135b112c-catalog-content\") pod \"community-operators-spln9\" (UID: \"d4a118d8-4b9b-465b-bd8d-c10f135b112c\") " pod="openshift-marketplace/community-operators-spln9" Dec 05 07:09:44 crc kubenswrapper[4865]: I1205 07:09:44.485585 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfj4p\" (UniqueName: \"kubernetes.io/projected/d4a118d8-4b9b-465b-bd8d-c10f135b112c-kube-api-access-jfj4p\") pod \"community-operators-spln9\" (UID: \"d4a118d8-4b9b-465b-bd8d-c10f135b112c\") " pod="openshift-marketplace/community-operators-spln9" Dec 05 07:09:44 crc kubenswrapper[4865]: I1205 07:09:44.485625 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a118d8-4b9b-465b-bd8d-c10f135b112c-utilities\") pod \"community-operators-spln9\" (UID: \"d4a118d8-4b9b-465b-bd8d-c10f135b112c\") " pod="openshift-marketplace/community-operators-spln9" Dec 05 07:09:44 crc kubenswrapper[4865]: I1205 07:09:44.486326 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a118d8-4b9b-465b-bd8d-c10f135b112c-catalog-content\") pod \"community-operators-spln9\" (UID: \"d4a118d8-4b9b-465b-bd8d-c10f135b112c\") " pod="openshift-marketplace/community-operators-spln9" Dec 05 07:09:44 crc kubenswrapper[4865]: I1205 07:09:44.486527 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a118d8-4b9b-465b-bd8d-c10f135b112c-utilities\") pod \"community-operators-spln9\" (UID: \"d4a118d8-4b9b-465b-bd8d-c10f135b112c\") " pod="openshift-marketplace/community-operators-spln9" Dec 05 07:09:44 crc kubenswrapper[4865]: I1205 07:09:44.514009 4865 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jfj4p\" (UniqueName: \"kubernetes.io/projected/d4a118d8-4b9b-465b-bd8d-c10f135b112c-kube-api-access-jfj4p\") pod \"community-operators-spln9\" (UID: \"d4a118d8-4b9b-465b-bd8d-c10f135b112c\") " pod="openshift-marketplace/community-operators-spln9" Dec 05 07:09:44 crc kubenswrapper[4865]: I1205 07:09:44.631059 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-spln9" Dec 05 07:09:44 crc kubenswrapper[4865]: I1205 07:09:44.907057 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_05035a7d-0d83-46dd-a889-3db64fb647e8/memcached/0.log" Dec 05 07:09:45 crc kubenswrapper[4865]: I1205 07:09:45.373930 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-spln9"] Dec 05 07:09:45 crc kubenswrapper[4865]: I1205 07:09:45.802633 4865 generic.go:334] "Generic (PLEG): container finished" podID="d4a118d8-4b9b-465b-bd8d-c10f135b112c" containerID="9348032e1e58bf2fc909bfa56bb471a6747e673b5a279d51b83d98f3067d3d3f" exitCode=0 Dec 05 07:09:45 crc kubenswrapper[4865]: I1205 07:09:45.802953 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spln9" event={"ID":"d4a118d8-4b9b-465b-bd8d-c10f135b112c","Type":"ContainerDied","Data":"9348032e1e58bf2fc909bfa56bb471a6747e673b5a279d51b83d98f3067d3d3f"} Dec 05 07:09:45 crc kubenswrapper[4865]: I1205 07:09:45.802985 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spln9" event={"ID":"d4a118d8-4b9b-465b-bd8d-c10f135b112c","Type":"ContainerStarted","Data":"26c640d52b4045560dd5b05f910e699eed83542439b08646e09237de8a5a039f"} Dec 05 07:09:46 crc kubenswrapper[4865]: I1205 07:09:46.815914 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spln9" event={"ID":"d4a118d8-4b9b-465b-bd8d-c10f135b112c","Type":"ContainerStarted","Data":"1c2aed5e4ad0e51a8a5f7a32dbd614a4f30996649086c1e0b2afc8d5aa9ce068"} Dec 05 07:09:47 crc kubenswrapper[4865]: I1205 07:09:47.006629 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:09:47 crc kubenswrapper[4865]: I1205 07:09:47.828003 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"b771b741cb44ec954a3a79410d5ce3b45d6985c5296d041b26ef7d452be8c635"} Dec 05 07:09:47 crc kubenswrapper[4865]: I1205 07:09:47.833056 4865 generic.go:334] "Generic (PLEG): container finished" podID="d4a118d8-4b9b-465b-bd8d-c10f135b112c" containerID="1c2aed5e4ad0e51a8a5f7a32dbd614a4f30996649086c1e0b2afc8d5aa9ce068" exitCode=0 Dec 05 07:09:47 crc kubenswrapper[4865]: I1205 07:09:47.833089 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spln9" event={"ID":"d4a118d8-4b9b-465b-bd8d-c10f135b112c","Type":"ContainerDied","Data":"1c2aed5e4ad0e51a8a5f7a32dbd614a4f30996649086c1e0b2afc8d5aa9ce068"} Dec 05 07:09:49 crc kubenswrapper[4865]: I1205 07:09:49.853834 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spln9" event={"ID":"d4a118d8-4b9b-465b-bd8d-c10f135b112c","Type":"ContainerStarted","Data":"dae3c8cabf3d2736d84bd8e7cedb29870f78e92e949167b54760bbd63f4cbf91"} Dec 05 07:09:54 crc 
kubenswrapper[4865]: I1205 07:09:54.631636 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-spln9" Dec 05 07:09:54 crc kubenswrapper[4865]: I1205 07:09:54.633502 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-spln9" Dec 05 07:09:54 crc kubenswrapper[4865]: I1205 07:09:54.683513 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-spln9" Dec 05 07:09:54 crc kubenswrapper[4865]: I1205 07:09:54.706650 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-spln9" podStartSLOduration=7.801860147 podStartE2EDuration="10.706627198s" podCreationTimestamp="2025-12-05 07:09:44 +0000 UTC" firstStartedPulling="2025-12-05 07:09:45.80479975 +0000 UTC m=+4605.084810972" lastFinishedPulling="2025-12-05 07:09:48.709566801 +0000 UTC m=+4607.989578023" observedRunningTime="2025-12-05 07:09:49.879224969 +0000 UTC m=+4609.159236191" watchObservedRunningTime="2025-12-05 07:09:54.706627198 +0000 UTC m=+4613.986638420" Dec 05 07:09:54 crc kubenswrapper[4865]: I1205 07:09:54.988666 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-spln9" Dec 05 07:09:55 crc kubenswrapper[4865]: I1205 07:09:55.045430 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-spln9"] Dec 05 07:09:56 crc kubenswrapper[4865]: I1205 07:09:56.952041 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-spln9" podUID="d4a118d8-4b9b-465b-bd8d-c10f135b112c" containerName="registry-server" containerID="cri-o://dae3c8cabf3d2736d84bd8e7cedb29870f78e92e949167b54760bbd63f4cbf91" gracePeriod=2 Dec 05 07:09:57 crc kubenswrapper[4865]: I1205 07:09:57.450960 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-spln9" Dec 05 07:09:57 crc kubenswrapper[4865]: I1205 07:09:57.588644 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a118d8-4b9b-465b-bd8d-c10f135b112c-catalog-content\") pod \"d4a118d8-4b9b-465b-bd8d-c10f135b112c\" (UID: \"d4a118d8-4b9b-465b-bd8d-c10f135b112c\") " Dec 05 07:09:57 crc kubenswrapper[4865]: I1205 07:09:57.588704 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfj4p\" (UniqueName: \"kubernetes.io/projected/d4a118d8-4b9b-465b-bd8d-c10f135b112c-kube-api-access-jfj4p\") pod \"d4a118d8-4b9b-465b-bd8d-c10f135b112c\" (UID: \"d4a118d8-4b9b-465b-bd8d-c10f135b112c\") " Dec 05 07:09:57 crc kubenswrapper[4865]: I1205 07:09:57.588793 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a118d8-4b9b-465b-bd8d-c10f135b112c-utilities\") pod \"d4a118d8-4b9b-465b-bd8d-c10f135b112c\" (UID: \"d4a118d8-4b9b-465b-bd8d-c10f135b112c\") " Dec 05 07:09:57 crc kubenswrapper[4865]: I1205 07:09:57.589908 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4a118d8-4b9b-465b-bd8d-c10f135b112c-utilities" (OuterVolumeSpecName: "utilities") pod "d4a118d8-4b9b-465b-bd8d-c10f135b112c" (UID: "d4a118d8-4b9b-465b-bd8d-c10f135b112c"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:09:57 crc kubenswrapper[4865]: I1205 07:09:57.596171 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4a118d8-4b9b-465b-bd8d-c10f135b112c-kube-api-access-jfj4p" (OuterVolumeSpecName: "kube-api-access-jfj4p") pod "d4a118d8-4b9b-465b-bd8d-c10f135b112c" (UID: "d4a118d8-4b9b-465b-bd8d-c10f135b112c"). InnerVolumeSpecName "kube-api-access-jfj4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:09:57 crc kubenswrapper[4865]: I1205 07:09:57.638876 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4a118d8-4b9b-465b-bd8d-c10f135b112c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4a118d8-4b9b-465b-bd8d-c10f135b112c" (UID: "d4a118d8-4b9b-465b-bd8d-c10f135b112c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:09:57 crc kubenswrapper[4865]: I1205 07:09:57.691247 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a118d8-4b9b-465b-bd8d-c10f135b112c-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 07:09:57 crc kubenswrapper[4865]: I1205 07:09:57.691275 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a118d8-4b9b-465b-bd8d-c10f135b112c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 07:09:57 crc kubenswrapper[4865]: I1205 07:09:57.691286 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfj4p\" (UniqueName: \"kubernetes.io/projected/d4a118d8-4b9b-465b-bd8d-c10f135b112c-kube-api-access-jfj4p\") on node \"crc\" DevicePath \"\"" Dec 05 07:09:57 crc kubenswrapper[4865]: I1205 07:09:57.964207 4865 generic.go:334] "Generic (PLEG): container finished" podID="d4a118d8-4b9b-465b-bd8d-c10f135b112c" containerID="dae3c8cabf3d2736d84bd8e7cedb29870f78e92e949167b54760bbd63f4cbf91" exitCode=0 Dec 05 07:09:57 crc kubenswrapper[4865]: I1205 07:09:57.964250 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spln9" event={"ID":"d4a118d8-4b9b-465b-bd8d-c10f135b112c","Type":"ContainerDied","Data":"dae3c8cabf3d2736d84bd8e7cedb29870f78e92e949167b54760bbd63f4cbf91"} Dec 05 07:09:57 crc kubenswrapper[4865]: I1205 07:09:57.964275 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spln9" event={"ID":"d4a118d8-4b9b-465b-bd8d-c10f135b112c","Type":"ContainerDied","Data":"26c640d52b4045560dd5b05f910e699eed83542439b08646e09237de8a5a039f"} Dec 05 07:09:57 crc kubenswrapper[4865]: I1205 07:09:57.964293 4865 scope.go:117] "RemoveContainer" containerID="dae3c8cabf3d2736d84bd8e7cedb29870f78e92e949167b54760bbd63f4cbf91" Dec 05 07:09:57 crc kubenswrapper[4865]: I1205 07:09:57.964447 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-spln9" Dec 05 07:09:57 crc kubenswrapper[4865]: I1205 07:09:57.998065 4865 scope.go:117] "RemoveContainer" containerID="1c2aed5e4ad0e51a8a5f7a32dbd614a4f30996649086c1e0b2afc8d5aa9ce068" Dec 05 07:09:58 crc kubenswrapper[4865]: I1205 07:09:58.006622 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-spln9"] Dec 05 07:09:58 crc kubenswrapper[4865]: I1205 07:09:58.017453 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-spln9"] Dec 05 07:09:58 crc kubenswrapper[4865]: I1205 07:09:58.343365 4865 scope.go:117] "RemoveContainer" containerID="9348032e1e58bf2fc909bfa56bb471a6747e673b5a279d51b83d98f3067d3d3f" Dec 05 07:09:58 crc kubenswrapper[4865]: I1205 07:09:58.503368 4865 scope.go:117] "RemoveContainer" containerID="dae3c8cabf3d2736d84bd8e7cedb29870f78e92e949167b54760bbd63f4cbf91" Dec 05 07:09:58 crc kubenswrapper[4865]: E1205 07:09:58.503858 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dae3c8cabf3d2736d84bd8e7cedb29870f78e92e949167b54760bbd63f4cbf91\": container with ID starting with dae3c8cabf3d2736d84bd8e7cedb29870f78e92e949167b54760bbd63f4cbf91 not found: ID does not exist" containerID="dae3c8cabf3d2736d84bd8e7cedb29870f78e92e949167b54760bbd63f4cbf91" Dec 05 07:09:58 crc kubenswrapper[4865]: I1205 07:09:58.503900 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dae3c8cabf3d2736d84bd8e7cedb29870f78e92e949167b54760bbd63f4cbf91"} err="failed to get container status \"dae3c8cabf3d2736d84bd8e7cedb29870f78e92e949167b54760bbd63f4cbf91\": rpc error: code = NotFound desc = could not find container \"dae3c8cabf3d2736d84bd8e7cedb29870f78e92e949167b54760bbd63f4cbf91\": container with ID starting with dae3c8cabf3d2736d84bd8e7cedb29870f78e92e949167b54760bbd63f4cbf91 not found: ID does not exist" Dec 05 07:09:58 crc kubenswrapper[4865]: I1205 07:09:58.503928 4865 scope.go:117] "RemoveContainer" containerID="1c2aed5e4ad0e51a8a5f7a32dbd614a4f30996649086c1e0b2afc8d5aa9ce068" Dec 05 07:09:58 crc kubenswrapper[4865]: E1205 07:09:58.504138 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c2aed5e4ad0e51a8a5f7a32dbd614a4f30996649086c1e0b2afc8d5aa9ce068\": container with ID starting with 1c2aed5e4ad0e51a8a5f7a32dbd614a4f30996649086c1e0b2afc8d5aa9ce068 not found: ID does not exist" containerID="1c2aed5e4ad0e51a8a5f7a32dbd614a4f30996649086c1e0b2afc8d5aa9ce068" Dec 05 07:09:58 crc kubenswrapper[4865]: I1205 07:09:58.504164 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c2aed5e4ad0e51a8a5f7a32dbd614a4f30996649086c1e0b2afc8d5aa9ce068"} err="failed to get container status \"1c2aed5e4ad0e51a8a5f7a32dbd614a4f30996649086c1e0b2afc8d5aa9ce068\": rpc error: code = NotFound desc = could not find container \"1c2aed5e4ad0e51a8a5f7a32dbd614a4f30996649086c1e0b2afc8d5aa9ce068\": container with ID starting with 1c2aed5e4ad0e51a8a5f7a32dbd614a4f30996649086c1e0b2afc8d5aa9ce068 not found: ID does not exist" Dec 05 07:09:58 crc kubenswrapper[4865]: I1205 07:09:58.504182 4865 scope.go:117] "RemoveContainer" containerID="9348032e1e58bf2fc909bfa56bb471a6747e673b5a279d51b83d98f3067d3d3f" Dec 05 07:09:58 crc kubenswrapper[4865]: E1205 07:09:58.504357 4865 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9348032e1e58bf2fc909bfa56bb471a6747e673b5a279d51b83d98f3067d3d3f\": container with ID starting with 9348032e1e58bf2fc909bfa56bb471a6747e673b5a279d51b83d98f3067d3d3f not found: ID does not exist" containerID="9348032e1e58bf2fc909bfa56bb471a6747e673b5a279d51b83d98f3067d3d3f" Dec 05 07:09:58 crc kubenswrapper[4865]: I1205 07:09:58.504381 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9348032e1e58bf2fc909bfa56bb471a6747e673b5a279d51b83d98f3067d3d3f"} err="failed to get container status \"9348032e1e58bf2fc909bfa56bb471a6747e673b5a279d51b83d98f3067d3d3f\": rpc error: code = NotFound desc = could not find container \"9348032e1e58bf2fc909bfa56bb471a6747e673b5a279d51b83d98f3067d3d3f\": container with ID starting with 9348032e1e58bf2fc909bfa56bb471a6747e673b5a279d51b83d98f3067d3d3f not found: ID does not exist" Dec 05 07:09:59 crc kubenswrapper[4865]: I1205 07:09:59.016014 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4a118d8-4b9b-465b-bd8d-c10f135b112c" path="/var/lib/kubelet/pods/d4a118d8-4b9b-465b-bd8d-c10f135b112c/volumes" Dec 05 07:10:10 crc kubenswrapper[4865]: I1205 07:10:10.067170 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9_afe32c73-a754-43f9-bc45-1ec0219469d9/util/0.log" Dec 05 07:10:10 crc kubenswrapper[4865]: I1205 07:10:10.486814 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9_afe32c73-a754-43f9-bc45-1ec0219469d9/util/0.log" Dec 05 07:10:10 crc kubenswrapper[4865]: I1205 07:10:10.546952 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9_afe32c73-a754-43f9-bc45-1ec0219469d9/pull/0.log" Dec 05 07:10:10 crc kubenswrapper[4865]: I1205 07:10:10.572439 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9_afe32c73-a754-43f9-bc45-1ec0219469d9/pull/0.log" Dec 05 07:10:10 crc kubenswrapper[4865]: I1205 07:10:10.832581 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9_afe32c73-a754-43f9-bc45-1ec0219469d9/util/0.log" Dec 05 07:10:10 crc kubenswrapper[4865]: I1205 07:10:10.866563 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9_afe32c73-a754-43f9-bc45-1ec0219469d9/pull/0.log" Dec 05 07:10:10 crc kubenswrapper[4865]: I1205 07:10:10.910613 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9_afe32c73-a754-43f9-bc45-1ec0219469d9/extract/0.log" Dec 05 07:10:11 crc kubenswrapper[4865]: I1205 07:10:11.155937 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-7jl8d_59231c2f-740e-4c04-af17-53dab82b3497/kube-rbac-proxy/0.log" Dec 05 07:10:11 crc kubenswrapper[4865]: I1205 07:10:11.327793 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-7jl8d_59231c2f-740e-4c04-af17-53dab82b3497/manager/0.log" Dec 05 07:10:11 crc 
kubenswrapper[4865]: I1205 07:10:11.352680 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-97l79_f4fc5327-1468-48aa-9a51-e8be8bfb5629/kube-rbac-proxy/0.log" Dec 05 07:10:11 crc kubenswrapper[4865]: I1205 07:10:11.529985 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-97l79_f4fc5327-1468-48aa-9a51-e8be8bfb5629/manager/0.log" Dec 05 07:10:11 crc kubenswrapper[4865]: I1205 07:10:11.604442 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-j6st7_30f6dc0d-1962-42c0-a128-d7a54943d849/kube-rbac-proxy/0.log" Dec 05 07:10:11 crc kubenswrapper[4865]: I1205 07:10:11.706188 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-j6st7_30f6dc0d-1962-42c0-a128-d7a54943d849/manager/0.log" Dec 05 07:10:11 crc kubenswrapper[4865]: I1205 07:10:11.856098 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-7wrx8_a44f8567-c35d-4bf4-be5c-ffbde539bb3a/kube-rbac-proxy/0.log" Dec 05 07:10:11 crc kubenswrapper[4865]: I1205 07:10:11.990871 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-7wrx8_a44f8567-c35d-4bf4-be5c-ffbde539bb3a/manager/0.log" Dec 05 07:10:12 crc kubenswrapper[4865]: I1205 07:10:12.107935 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-kkstd_87bce1fb-16c2-4c47-aa02-3f94aa681b58/manager/0.log" Dec 05 07:10:12 crc kubenswrapper[4865]: I1205 07:10:12.122985 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-kkstd_87bce1fb-16c2-4c47-aa02-3f94aa681b58/kube-rbac-proxy/0.log" Dec 05 07:10:12 crc kubenswrapper[4865]: I1205 07:10:12.271788 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-8zkdr_db94fe25-0c93-4471-852d-45b20c0f266c/kube-rbac-proxy/0.log" Dec 05 07:10:12 crc kubenswrapper[4865]: I1205 07:10:12.353248 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-8zkdr_db94fe25-0c93-4471-852d-45b20c0f266c/manager/0.log" Dec 05 07:10:12 crc kubenswrapper[4865]: I1205 07:10:12.468281 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-758b7cbd9c-d2qcb_e13948be-6623-4815-af50-6e2b5ee807ba/kube-rbac-proxy/0.log" Dec 05 07:10:12 crc kubenswrapper[4865]: I1205 07:10:12.654547 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-r8f45_8d67bcae-4ae9-4545-8410-236efec0cc30/kube-rbac-proxy/0.log" Dec 05 07:10:12 crc kubenswrapper[4865]: I1205 07:10:12.683687 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-758b7cbd9c-d2qcb_e13948be-6623-4815-af50-6e2b5ee807ba/manager/0.log" Dec 05 07:10:12 crc kubenswrapper[4865]: I1205 07:10:12.745553 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-r8f45_8d67bcae-4ae9-4545-8410-236efec0cc30/manager/0.log" Dec 05 
07:10:12 crc kubenswrapper[4865]: I1205 07:10:12.868513 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-mlcgh_c3d9f2e6-7658-4f43-8d62-72bd4305c06a/kube-rbac-proxy/0.log" Dec 05 07:10:12 crc kubenswrapper[4865]: I1205 07:10:12.983717 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-mlcgh_c3d9f2e6-7658-4f43-8d62-72bd4305c06a/manager/0.log" Dec 05 07:10:13 crc kubenswrapper[4865]: I1205 07:10:13.123508 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-vjgsh_60a54835-3802-4f32-be4f-ea7ace9084f6/kube-rbac-proxy/0.log" Dec 05 07:10:13 crc kubenswrapper[4865]: I1205 07:10:13.176993 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-vjgsh_60a54835-3802-4f32-be4f-ea7ace9084f6/manager/0.log" Dec 05 07:10:13 crc kubenswrapper[4865]: I1205 07:10:13.288786 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-cv8vc_1bad98dd-eca3-4f98-884a-655e104b2d92/kube-rbac-proxy/0.log" Dec 05 07:10:13 crc kubenswrapper[4865]: I1205 07:10:13.390363 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-cv8vc_1bad98dd-eca3-4f98-884a-655e104b2d92/manager/0.log" Dec 05 07:10:13 crc kubenswrapper[4865]: I1205 07:10:13.426770 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-v25kd_2364f477-be51-4698-914a-94d0fd2dd983/kube-rbac-proxy/0.log" Dec 05 07:10:13 crc kubenswrapper[4865]: I1205 07:10:13.573088 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-v25kd_2364f477-be51-4698-914a-94d0fd2dd983/manager/0.log" Dec 05 07:10:13 crc kubenswrapper[4865]: I1205 07:10:13.757154 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-7nqrp_8e1c4c0e-047b-4727-9435-7192e4f48bea/kube-rbac-proxy/0.log" Dec 05 07:10:13 crc kubenswrapper[4865]: I1205 07:10:13.816970 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-7nqrp_8e1c4c0e-047b-4727-9435-7192e4f48bea/manager/0.log" Dec 05 07:10:14 crc kubenswrapper[4865]: I1205 07:10:14.349015 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-4546x_571eed7b-c231-42db-8acd-8f2efc828947/kube-rbac-proxy/0.log" Dec 05 07:10:14 crc kubenswrapper[4865]: I1205 07:10:14.468894 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-4546x_571eed7b-c231-42db-8acd-8f2efc828947/manager/0.log" Dec 05 07:10:14 crc kubenswrapper[4865]: I1205 07:10:14.518344 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fzpfrb_2d41068d-3439-4a1d-bb73-9d974c281d4c/kube-rbac-proxy/0.log" Dec 05 07:10:14 crc kubenswrapper[4865]: I1205 07:10:14.582020 4865 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fzpfrb_2d41068d-3439-4a1d-bb73-9d974c281d4c/manager/0.log" Dec 05 07:10:15 crc kubenswrapper[4865]: I1205 07:10:15.229489 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-554dbdfbd5-l48sk_7f835712-3e64-4461-89e1-4eac5548bff5/operator/0.log" Dec 05 07:10:15 crc kubenswrapper[4865]: I1205 07:10:15.263555 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-cgngs_5f11965e-838f-4054-ad28-f25e9ba54596/registry-server/0.log" Dec 05 07:10:15 crc kubenswrapper[4865]: I1205 07:10:15.609313 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-2jh72_51ef47f4-9d56-4555-9a53-007c8648651a/kube-rbac-proxy/0.log" Dec 05 07:10:15 crc kubenswrapper[4865]: I1205 07:10:15.665272 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-2jh72_51ef47f4-9d56-4555-9a53-007c8648651a/manager/0.log" Dec 05 07:10:15 crc kubenswrapper[4865]: I1205 07:10:15.724547 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6f6696b64-hqh47_0e3dd976-2c50-4721-a9a3-330c906f0e16/manager/0.log" Dec 05 07:10:15 crc kubenswrapper[4865]: I1205 07:10:15.827084 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-cdf4c_1caf6bc1-a2e2-4330-bc4f-1f324ec5de84/kube-rbac-proxy/0.log" Dec 05 07:10:15 crc kubenswrapper[4865]: I1205 07:10:15.870335 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-cdf4c_1caf6bc1-a2e2-4330-bc4f-1f324ec5de84/manager/0.log" Dec 05 07:10:15 crc kubenswrapper[4865]: I1205 07:10:15.973113 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-bg55s_c21265ee-9968-411a-9387-f0c3920b3883/operator/0.log" Dec 05 07:10:16 crc kubenswrapper[4865]: I1205 07:10:16.085672 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-qphvq_0445a96f-f840-45c4-a1c3-f4455c49b216/kube-rbac-proxy/0.log" Dec 05 07:10:16 crc kubenswrapper[4865]: I1205 07:10:16.176603 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-qphvq_0445a96f-f840-45c4-a1c3-f4455c49b216/manager/0.log" Dec 05 07:10:16 crc kubenswrapper[4865]: I1205 07:10:16.250699 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-cppdn_1363659b-58f9-4f41-800c-863dd656d2b8/kube-rbac-proxy/0.log" Dec 05 07:10:16 crc kubenswrapper[4865]: I1205 07:10:16.341360 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-cppdn_1363659b-58f9-4f41-800c-863dd656d2b8/manager/0.log" Dec 05 07:10:16 crc kubenswrapper[4865]: I1205 07:10:16.460011 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-j5vmw_9b591a19-b272-4a03-8164-c0296161feb7/kube-rbac-proxy/0.log" Dec 05 07:10:16 crc kubenswrapper[4865]: I1205 07:10:16.468368 4865 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-j5vmw_9b591a19-b272-4a03-8164-c0296161feb7/manager/0.log" Dec 05 07:10:16 crc kubenswrapper[4865]: I1205 07:10:16.577356 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-zzb4b_cf1398f2-aa09-45bb-9a98-5fadca999284/kube-rbac-proxy/0.log" Dec 05 07:10:16 crc kubenswrapper[4865]: I1205 07:10:16.577640 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-zzb4b_cf1398f2-aa09-45bb-9a98-5fadca999284/manager/0.log" Dec 05 07:10:39 crc kubenswrapper[4865]: I1205 07:10:39.911781 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-fmdqp_5293d191-528f-4818-b897-11bb456c2b50/control-plane-machine-set-operator/0.log" Dec 05 07:10:40 crc kubenswrapper[4865]: I1205 07:10:40.017808 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-v2jhh_c0cebc10-c0ad-419c-903c-341c516f1527/kube-rbac-proxy/0.log" Dec 05 07:10:40 crc kubenswrapper[4865]: I1205 07:10:40.114143 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-v2jhh_c0cebc10-c0ad-419c-903c-341c516f1527/machine-api-operator/0.log" Dec 05 07:10:48 crc kubenswrapper[4865]: I1205 07:10:48.272478 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nd92m"] Dec 05 07:10:48 crc kubenswrapper[4865]: E1205 07:10:48.273565 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a118d8-4b9b-465b-bd8d-c10f135b112c" containerName="extract-utilities" Dec 05 07:10:48 crc kubenswrapper[4865]: I1205 07:10:48.273586 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a118d8-4b9b-465b-bd8d-c10f135b112c" containerName="extract-utilities" Dec 05 07:10:48 crc kubenswrapper[4865]: E1205 07:10:48.273609 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a118d8-4b9b-465b-bd8d-c10f135b112c" containerName="registry-server" Dec 05 07:10:48 crc kubenswrapper[4865]: I1205 07:10:48.273616 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a118d8-4b9b-465b-bd8d-c10f135b112c" containerName="registry-server" Dec 05 07:10:48 crc kubenswrapper[4865]: E1205 07:10:48.273646 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a118d8-4b9b-465b-bd8d-c10f135b112c" containerName="extract-content" Dec 05 07:10:48 crc kubenswrapper[4865]: I1205 07:10:48.273654 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a118d8-4b9b-465b-bd8d-c10f135b112c" containerName="extract-content" Dec 05 07:10:48 crc kubenswrapper[4865]: I1205 07:10:48.273887 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4a118d8-4b9b-465b-bd8d-c10f135b112c" containerName="registry-server" Dec 05 07:10:48 crc kubenswrapper[4865]: I1205 07:10:48.275651 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nd92m" Dec 05 07:10:48 crc kubenswrapper[4865]: I1205 07:10:48.304689 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nd92m"] Dec 05 07:10:48 crc kubenswrapper[4865]: I1205 07:10:48.387270 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3-catalog-content\") pod \"redhat-operators-nd92m\" (UID: \"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3\") " pod="openshift-marketplace/redhat-operators-nd92m" Dec 05 07:10:48 crc kubenswrapper[4865]: I1205 07:10:48.387329 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3-utilities\") pod \"redhat-operators-nd92m\" (UID: \"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3\") " pod="openshift-marketplace/redhat-operators-nd92m" Dec 05 07:10:48 crc kubenswrapper[4865]: I1205 07:10:48.387615 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45gfw\" (UniqueName: \"kubernetes.io/projected/fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3-kube-api-access-45gfw\") pod \"redhat-operators-nd92m\" (UID: \"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3\") " pod="openshift-marketplace/redhat-operators-nd92m" Dec 05 07:10:48 crc kubenswrapper[4865]: I1205 07:10:48.489157 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3-catalog-content\") pod \"redhat-operators-nd92m\" (UID: \"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3\") " pod="openshift-marketplace/redhat-operators-nd92m" Dec 05 07:10:48 crc kubenswrapper[4865]: I1205 07:10:48.489204 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3-utilities\") pod \"redhat-operators-nd92m\" (UID: \"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3\") " pod="openshift-marketplace/redhat-operators-nd92m" Dec 05 07:10:48 crc kubenswrapper[4865]: I1205 07:10:48.489314 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45gfw\" (UniqueName: \"kubernetes.io/projected/fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3-kube-api-access-45gfw\") pod \"redhat-operators-nd92m\" (UID: \"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3\") " pod="openshift-marketplace/redhat-operators-nd92m" Dec 05 07:10:48 crc kubenswrapper[4865]: I1205 07:10:48.489744 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3-catalog-content\") pod \"redhat-operators-nd92m\" (UID: \"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3\") " pod="openshift-marketplace/redhat-operators-nd92m" Dec 05 07:10:48 crc kubenswrapper[4865]: I1205 07:10:48.489749 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3-utilities\") pod \"redhat-operators-nd92m\" (UID: \"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3\") " pod="openshift-marketplace/redhat-operators-nd92m" Dec 05 07:10:48 crc kubenswrapper[4865]: I1205 07:10:48.522080 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-45gfw\" (UniqueName: \"kubernetes.io/projected/fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3-kube-api-access-45gfw\") pod \"redhat-operators-nd92m\" (UID: \"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3\") " pod="openshift-marketplace/redhat-operators-nd92m" Dec 05 07:10:48 crc kubenswrapper[4865]: I1205 07:10:48.598444 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nd92m" Dec 05 07:10:49 crc kubenswrapper[4865]: I1205 07:10:49.182215 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nd92m"] Dec 05 07:10:49 crc kubenswrapper[4865]: I1205 07:10:49.443567 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd92m" event={"ID":"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3","Type":"ContainerStarted","Data":"241b028239bc5f4c68a6326081cc1827a42b0ca1fea889b2f6dcdb0b46aab7c4"} Dec 05 07:10:49 crc kubenswrapper[4865]: I1205 07:10:49.443900 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd92m" event={"ID":"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3","Type":"ContainerStarted","Data":"99b7f8b5e2fdb001d875cfe7b4e0173d52332443cc215e5ffc6420d782e9d40a"} Dec 05 07:10:50 crc kubenswrapper[4865]: I1205 07:10:50.466442 4865 generic.go:334] "Generic (PLEG): container finished" podID="fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3" containerID="241b028239bc5f4c68a6326081cc1827a42b0ca1fea889b2f6dcdb0b46aab7c4" exitCode=0 Dec 05 07:10:50 crc kubenswrapper[4865]: I1205 07:10:50.466492 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd92m" event={"ID":"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3","Type":"ContainerDied","Data":"241b028239bc5f4c68a6326081cc1827a42b0ca1fea889b2f6dcdb0b46aab7c4"} Dec 05 07:10:52 crc kubenswrapper[4865]: I1205 07:10:52.489046 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd92m" event={"ID":"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3","Type":"ContainerStarted","Data":"d510f5d084863125ca0ea1b620dedc8ed1c7681cd5a4d2d42b5d67c3ce0249a9"} Dec 05 07:10:54 crc kubenswrapper[4865]: I1205 07:10:54.911494 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-8fhmd_1ee0a305-a19d-4053-995b-e30a57c8cc07/cert-manager-controller/0.log" Dec 05 07:10:55 crc kubenswrapper[4865]: I1205 07:10:55.098629 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-nkfts_5d3a98df-9953-49ab-a722-f37837073178/cert-manager-webhook/0.log" Dec 05 07:10:55 crc kubenswrapper[4865]: I1205 07:10:55.320363 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-hs4kn_b2f0ff72-fd51-40eb-94c9-7dea7e5c48ae/cert-manager-cainjector/0.log" Dec 05 07:10:55 crc kubenswrapper[4865]: I1205 07:10:55.522559 4865 generic.go:334] "Generic (PLEG): container finished" podID="fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3" containerID="d510f5d084863125ca0ea1b620dedc8ed1c7681cd5a4d2d42b5d67c3ce0249a9" exitCode=0 Dec 05 07:10:55 crc kubenswrapper[4865]: I1205 07:10:55.522641 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd92m" event={"ID":"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3","Type":"ContainerDied","Data":"d510f5d084863125ca0ea1b620dedc8ed1c7681cd5a4d2d42b5d67c3ce0249a9"} Dec 05 07:10:57 crc kubenswrapper[4865]: I1205 07:10:57.541548 4865 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd92m" event={"ID":"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3","Type":"ContainerStarted","Data":"28f70162394190c94a0d3447d8e9c57b94054bf37012b4bd5552743b37b82030"} Dec 05 07:10:57 crc kubenswrapper[4865]: I1205 07:10:57.568235 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nd92m" podStartSLOduration=3.54965491 podStartE2EDuration="9.568203475s" podCreationTimestamp="2025-12-05 07:10:48 +0000 UTC" firstStartedPulling="2025-12-05 07:10:50.47322727 +0000 UTC m=+4669.753238492" lastFinishedPulling="2025-12-05 07:10:56.491775825 +0000 UTC m=+4675.771787057" observedRunningTime="2025-12-05 07:10:57.558894662 +0000 UTC m=+4676.838905884" watchObservedRunningTime="2025-12-05 07:10:57.568203475 +0000 UTC m=+4676.848214697" Dec 05 07:10:58 crc kubenswrapper[4865]: I1205 07:10:58.598663 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nd92m" Dec 05 07:10:58 crc kubenswrapper[4865]: I1205 07:10:58.598734 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nd92m" Dec 05 07:10:59 crc kubenswrapper[4865]: I1205 07:10:59.656496 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nd92m" podUID="fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3" containerName="registry-server" probeResult="failure" output=< Dec 05 07:10:59 crc kubenswrapper[4865]: timeout: failed to connect service ":50051" within 1s Dec 05 07:10:59 crc kubenswrapper[4865]: > Dec 05 07:11:09 crc kubenswrapper[4865]: I1205 07:11:09.164552 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nd92m" Dec 05 07:11:09 crc kubenswrapper[4865]: I1205 07:11:09.226478 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nd92m" Dec 05 07:11:09 crc kubenswrapper[4865]: I1205 07:11:09.406407 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nd92m"] Dec 05 07:11:10 crc kubenswrapper[4865]: I1205 07:11:10.055346 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-ctjwp_47720eeb-4718-4077-8c64-8184aa08b670/nmstate-console-plugin/0.log" Dec 05 07:11:10 crc kubenswrapper[4865]: I1205 07:11:10.281746 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6nk5l_4b89f3ad-3a92-467c-a613-5c567bbe8e0e/nmstate-handler/0.log" Dec 05 07:11:10 crc kubenswrapper[4865]: I1205 07:11:10.316617 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-wm4nq_76931d96-861b-4372-9209-98f4b296df1c/kube-rbac-proxy/0.log" Dec 05 07:11:10 crc kubenswrapper[4865]: I1205 07:11:10.389466 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-wm4nq_76931d96-861b-4372-9209-98f4b296df1c/nmstate-metrics/0.log" Dec 05 07:11:10 crc kubenswrapper[4865]: I1205 07:11:10.600791 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-fdgdj_806823de-8d1e-48f9-964f-86cd689434c7/nmstate-operator/0.log" Dec 05 07:11:10 crc kubenswrapper[4865]: I1205 07:11:10.672670 4865 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-cpqvb_88204ee2-fa2e-4780-97c0-4ca5aa8554fe/nmstate-webhook/0.log" Dec 05 07:11:10 crc kubenswrapper[4865]: I1205 07:11:10.703437 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nd92m" podUID="fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3" containerName="registry-server" containerID="cri-o://28f70162394190c94a0d3447d8e9c57b94054bf37012b4bd5552743b37b82030" gracePeriod=2 Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.180982 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nd92m" Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.362640 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3-utilities\") pod \"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3\" (UID: \"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3\") " Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.363429 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3-catalog-content\") pod \"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3\" (UID: \"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3\") " Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.363508 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3-utilities" (OuterVolumeSpecName: "utilities") pod "fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3" (UID: "fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.363523 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45gfw\" (UniqueName: \"kubernetes.io/projected/fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3-kube-api-access-45gfw\") pod \"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3\" (UID: \"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3\") " Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.364501 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.369807 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3-kube-api-access-45gfw" (OuterVolumeSpecName: "kube-api-access-45gfw") pod "fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3" (UID: "fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3"). InnerVolumeSpecName "kube-api-access-45gfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.466347 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45gfw\" (UniqueName: \"kubernetes.io/projected/fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3-kube-api-access-45gfw\") on node \"crc\" DevicePath \"\"" Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.483534 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3" (UID: "fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.568067 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.713710 4865 generic.go:334] "Generic (PLEG): container finished" podID="fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3" containerID="28f70162394190c94a0d3447d8e9c57b94054bf37012b4bd5552743b37b82030" exitCode=0 Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.713795 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nd92m" Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.713810 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd92m" event={"ID":"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3","Type":"ContainerDied","Data":"28f70162394190c94a0d3447d8e9c57b94054bf37012b4bd5552743b37b82030"} Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.714342 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd92m" event={"ID":"fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3","Type":"ContainerDied","Data":"99b7f8b5e2fdb001d875cfe7b4e0173d52332443cc215e5ffc6420d782e9d40a"} Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.714370 4865 scope.go:117] "RemoveContainer" containerID="28f70162394190c94a0d3447d8e9c57b94054bf37012b4bd5552743b37b82030" Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.735740 4865 scope.go:117] "RemoveContainer" containerID="d510f5d084863125ca0ea1b620dedc8ed1c7681cd5a4d2d42b5d67c3ce0249a9" Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.753911 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nd92m"] Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.768648 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nd92m"] Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.779129 4865 scope.go:117] "RemoveContainer" containerID="241b028239bc5f4c68a6326081cc1827a42b0ca1fea889b2f6dcdb0b46aab7c4" Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.806395 4865 scope.go:117] "RemoveContainer" containerID="28f70162394190c94a0d3447d8e9c57b94054bf37012b4bd5552743b37b82030" Dec 05 07:11:11 crc kubenswrapper[4865]: E1205 07:11:11.806991 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28f70162394190c94a0d3447d8e9c57b94054bf37012b4bd5552743b37b82030\": container with ID starting with 28f70162394190c94a0d3447d8e9c57b94054bf37012b4bd5552743b37b82030 not found: ID does not exist" containerID="28f70162394190c94a0d3447d8e9c57b94054bf37012b4bd5552743b37b82030" Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.807089 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28f70162394190c94a0d3447d8e9c57b94054bf37012b4bd5552743b37b82030"} err="failed to get container status \"28f70162394190c94a0d3447d8e9c57b94054bf37012b4bd5552743b37b82030\": rpc error: code = NotFound desc = could not find container \"28f70162394190c94a0d3447d8e9c57b94054bf37012b4bd5552743b37b82030\": container with ID starting with 28f70162394190c94a0d3447d8e9c57b94054bf37012b4bd5552743b37b82030 not found: ID 
does not exist" Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.807169 4865 scope.go:117] "RemoveContainer" containerID="d510f5d084863125ca0ea1b620dedc8ed1c7681cd5a4d2d42b5d67c3ce0249a9" Dec 05 07:11:11 crc kubenswrapper[4865]: E1205 07:11:11.807456 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d510f5d084863125ca0ea1b620dedc8ed1c7681cd5a4d2d42b5d67c3ce0249a9\": container with ID starting with d510f5d084863125ca0ea1b620dedc8ed1c7681cd5a4d2d42b5d67c3ce0249a9 not found: ID does not exist" containerID="d510f5d084863125ca0ea1b620dedc8ed1c7681cd5a4d2d42b5d67c3ce0249a9" Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.807507 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d510f5d084863125ca0ea1b620dedc8ed1c7681cd5a4d2d42b5d67c3ce0249a9"} err="failed to get container status \"d510f5d084863125ca0ea1b620dedc8ed1c7681cd5a4d2d42b5d67c3ce0249a9\": rpc error: code = NotFound desc = could not find container \"d510f5d084863125ca0ea1b620dedc8ed1c7681cd5a4d2d42b5d67c3ce0249a9\": container with ID starting with d510f5d084863125ca0ea1b620dedc8ed1c7681cd5a4d2d42b5d67c3ce0249a9 not found: ID does not exist" Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.807542 4865 scope.go:117] "RemoveContainer" containerID="241b028239bc5f4c68a6326081cc1827a42b0ca1fea889b2f6dcdb0b46aab7c4" Dec 05 07:11:11 crc kubenswrapper[4865]: E1205 07:11:11.809250 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"241b028239bc5f4c68a6326081cc1827a42b0ca1fea889b2f6dcdb0b46aab7c4\": container with ID starting with 241b028239bc5f4c68a6326081cc1827a42b0ca1fea889b2f6dcdb0b46aab7c4 not found: ID does not exist" containerID="241b028239bc5f4c68a6326081cc1827a42b0ca1fea889b2f6dcdb0b46aab7c4" Dec 05 07:11:11 crc kubenswrapper[4865]: I1205 07:11:11.809290 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"241b028239bc5f4c68a6326081cc1827a42b0ca1fea889b2f6dcdb0b46aab7c4"} err="failed to get container status \"241b028239bc5f4c68a6326081cc1827a42b0ca1fea889b2f6dcdb0b46aab7c4\": rpc error: code = NotFound desc = could not find container \"241b028239bc5f4c68a6326081cc1827a42b0ca1fea889b2f6dcdb0b46aab7c4\": container with ID starting with 241b028239bc5f4c68a6326081cc1827a42b0ca1fea889b2f6dcdb0b46aab7c4 not found: ID does not exist" Dec 05 07:11:13 crc kubenswrapper[4865]: I1205 07:11:13.018014 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3" path="/var/lib/kubelet/pods/fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3/volumes" Dec 05 07:11:27 crc kubenswrapper[4865]: I1205 07:11:27.454580 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-m74rz_a03656bf-d0cc-4e06-b6ce-470766d186d0/kube-rbac-proxy/0.log" Dec 05 07:11:27 crc kubenswrapper[4865]: I1205 07:11:27.511155 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-m74rz_a03656bf-d0cc-4e06-b6ce-470766d186d0/controller/0.log" Dec 05 07:11:27 crc kubenswrapper[4865]: I1205 07:11:27.666578 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-frr-files/0.log" Dec 05 07:11:27 crc kubenswrapper[4865]: I1205 07:11:27.890480 4865 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-frr-files/0.log" Dec 05 07:11:27 crc kubenswrapper[4865]: I1205 07:11:27.928881 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-metrics/0.log" Dec 05 07:11:27 crc kubenswrapper[4865]: I1205 07:11:27.933996 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-reloader/0.log" Dec 05 07:11:27 crc kubenswrapper[4865]: I1205 07:11:27.949758 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-reloader/0.log" Dec 05 07:11:28 crc kubenswrapper[4865]: I1205 07:11:28.133780 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-frr-files/0.log" Dec 05 07:11:28 crc kubenswrapper[4865]: I1205 07:11:28.154762 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-metrics/0.log" Dec 05 07:11:28 crc kubenswrapper[4865]: I1205 07:11:28.215774 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-metrics/0.log" Dec 05 07:11:28 crc kubenswrapper[4865]: I1205 07:11:28.220507 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-reloader/0.log" Dec 05 07:11:28 crc kubenswrapper[4865]: I1205 07:11:28.357471 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-frr-files/0.log" Dec 05 07:11:28 crc kubenswrapper[4865]: I1205 07:11:28.388245 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-reloader/0.log" Dec 05 07:11:28 crc kubenswrapper[4865]: I1205 07:11:28.421082 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/controller/0.log" Dec 05 07:11:28 crc kubenswrapper[4865]: I1205 07:11:28.458104 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-metrics/0.log" Dec 05 07:11:28 crc kubenswrapper[4865]: I1205 07:11:28.627427 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/frr-metrics/0.log" Dec 05 07:11:28 crc kubenswrapper[4865]: I1205 07:11:28.722326 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/kube-rbac-proxy-frr/0.log" Dec 05 07:11:28 crc kubenswrapper[4865]: I1205 07:11:28.746422 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/kube-rbac-proxy/0.log" Dec 05 07:11:28 crc kubenswrapper[4865]: I1205 07:11:28.926105 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/reloader/0.log" Dec 05 07:11:29 crc kubenswrapper[4865]: I1205 07:11:29.087891 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-vhnps_048569aa-8159-43b3-9ed2-55cef99d90bb/frr-k8s-webhook-server/0.log" Dec 05 
07:11:29 crc kubenswrapper[4865]: I1205 07:11:29.353038 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-565b7bc7b8-6qwxr_46237ec7-567d-47d0-9994-120d3f2039e8/manager/0.log" Dec 05 07:11:29 crc kubenswrapper[4865]: I1205 07:11:29.536584 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5b4f9f77c-pdn56_d30b601d-b803-4d64-923f-b085545350ee/webhook-server/0.log" Dec 05 07:11:29 crc kubenswrapper[4865]: I1205 07:11:29.701082 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jmvs2_ce563705-9a7e-4202-a8f4-512c17a481fb/kube-rbac-proxy/0.log" Dec 05 07:11:29 crc kubenswrapper[4865]: I1205 07:11:29.811521 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/frr/0.log" Dec 05 07:11:30 crc kubenswrapper[4865]: I1205 07:11:30.231912 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jmvs2_ce563705-9a7e-4202-a8f4-512c17a481fb/speaker/0.log" Dec 05 07:11:44 crc kubenswrapper[4865]: I1205 07:11:44.187582 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t_9466efcc-eb07-4316-a188-5b18e8108180/util/0.log" Dec 05 07:11:44 crc kubenswrapper[4865]: I1205 07:11:44.370589 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t_9466efcc-eb07-4316-a188-5b18e8108180/pull/0.log" Dec 05 07:11:44 crc kubenswrapper[4865]: I1205 07:11:44.381990 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t_9466efcc-eb07-4316-a188-5b18e8108180/util/0.log" Dec 05 07:11:44 crc kubenswrapper[4865]: I1205 07:11:44.443504 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t_9466efcc-eb07-4316-a188-5b18e8108180/pull/0.log" Dec 05 07:11:44 crc kubenswrapper[4865]: I1205 07:11:44.609908 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t_9466efcc-eb07-4316-a188-5b18e8108180/util/0.log" Dec 05 07:11:44 crc kubenswrapper[4865]: I1205 07:11:44.635700 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t_9466efcc-eb07-4316-a188-5b18e8108180/pull/0.log" Dec 05 07:11:44 crc kubenswrapper[4865]: I1205 07:11:44.683376 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t_9466efcc-eb07-4316-a188-5b18e8108180/extract/0.log" Dec 05 07:11:44 crc kubenswrapper[4865]: I1205 07:11:44.766470 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb_24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec/util/0.log" Dec 05 07:11:44 crc kubenswrapper[4865]: I1205 07:11:44.995172 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb_24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec/util/0.log" Dec 05 07:11:44 crc kubenswrapper[4865]: I1205 07:11:44.996911 4865 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb_24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec/pull/0.log" Dec 05 07:11:45 crc kubenswrapper[4865]: I1205 07:11:45.043220 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb_24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec/pull/0.log" Dec 05 07:11:45 crc kubenswrapper[4865]: I1205 07:11:45.180991 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb_24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec/pull/0.log" Dec 05 07:11:45 crc kubenswrapper[4865]: I1205 07:11:45.215702 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb_24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec/util/0.log" Dec 05 07:11:45 crc kubenswrapper[4865]: I1205 07:11:45.231944 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb_24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec/extract/0.log" Dec 05 07:11:45 crc kubenswrapper[4865]: I1205 07:11:45.375244 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vvxh5_be94c16b-4a9a-4ad6-aafc-9879b95fdce6/extract-utilities/0.log" Dec 05 07:11:45 crc kubenswrapper[4865]: I1205 07:11:45.587915 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vvxh5_be94c16b-4a9a-4ad6-aafc-9879b95fdce6/extract-content/0.log" Dec 05 07:11:45 crc kubenswrapper[4865]: I1205 07:11:45.616508 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vvxh5_be94c16b-4a9a-4ad6-aafc-9879b95fdce6/extract-utilities/0.log" Dec 05 07:11:45 crc kubenswrapper[4865]: I1205 07:11:45.624662 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vvxh5_be94c16b-4a9a-4ad6-aafc-9879b95fdce6/extract-content/0.log" Dec 05 07:11:45 crc kubenswrapper[4865]: I1205 07:11:45.819363 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vvxh5_be94c16b-4a9a-4ad6-aafc-9879b95fdce6/extract-content/0.log" Dec 05 07:11:45 crc kubenswrapper[4865]: I1205 07:11:45.856524 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vvxh5_be94c16b-4a9a-4ad6-aafc-9879b95fdce6/extract-utilities/0.log" Dec 05 07:11:46 crc kubenswrapper[4865]: I1205 07:11:46.080702 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jjwkp_fa31bc96-9494-4d49-b271-7059d1a6d0e0/extract-utilities/0.log" Dec 05 07:11:46 crc kubenswrapper[4865]: I1205 07:11:46.331453 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jjwkp_fa31bc96-9494-4d49-b271-7059d1a6d0e0/extract-content/0.log" Dec 05 07:11:46 crc kubenswrapper[4865]: I1205 07:11:46.343601 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jjwkp_fa31bc96-9494-4d49-b271-7059d1a6d0e0/extract-utilities/0.log" Dec 05 07:11:46 crc kubenswrapper[4865]: I1205 07:11:46.480154 4865 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-vvxh5_be94c16b-4a9a-4ad6-aafc-9879b95fdce6/registry-server/0.log" Dec 05 07:11:46 crc kubenswrapper[4865]: I1205 07:11:46.485539 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jjwkp_fa31bc96-9494-4d49-b271-7059d1a6d0e0/extract-content/0.log" Dec 05 07:11:46 crc kubenswrapper[4865]: I1205 07:11:46.651733 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jjwkp_fa31bc96-9494-4d49-b271-7059d1a6d0e0/extract-utilities/0.log" Dec 05 07:11:46 crc kubenswrapper[4865]: I1205 07:11:46.724390 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jjwkp_fa31bc96-9494-4d49-b271-7059d1a6d0e0/extract-content/0.log" Dec 05 07:11:46 crc kubenswrapper[4865]: I1205 07:11:46.931184 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-bnwqc_688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d/marketplace-operator/0.log" Dec 05 07:11:47 crc kubenswrapper[4865]: I1205 07:11:47.162015 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-46kd4_f1f44df5-e37a-4e11-995d-91a6d2fc538d/extract-utilities/0.log" Dec 05 07:11:47 crc kubenswrapper[4865]: I1205 07:11:47.350124 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-46kd4_f1f44df5-e37a-4e11-995d-91a6d2fc538d/extract-utilities/0.log" Dec 05 07:11:47 crc kubenswrapper[4865]: I1205 07:11:47.380577 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-46kd4_f1f44df5-e37a-4e11-995d-91a6d2fc538d/extract-content/0.log" Dec 05 07:11:47 crc kubenswrapper[4865]: I1205 07:11:47.418897 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jjwkp_fa31bc96-9494-4d49-b271-7059d1a6d0e0/registry-server/0.log" Dec 05 07:11:47 crc kubenswrapper[4865]: I1205 07:11:47.476854 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-46kd4_f1f44df5-e37a-4e11-995d-91a6d2fc538d/extract-content/0.log" Dec 05 07:11:47 crc kubenswrapper[4865]: I1205 07:11:47.664929 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-46kd4_f1f44df5-e37a-4e11-995d-91a6d2fc538d/extract-content/0.log" Dec 05 07:11:47 crc kubenswrapper[4865]: I1205 07:11:47.682081 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-46kd4_f1f44df5-e37a-4e11-995d-91a6d2fc538d/extract-utilities/0.log" Dec 05 07:11:47 crc kubenswrapper[4865]: I1205 07:11:47.824351 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-46kd4_f1f44df5-e37a-4e11-995d-91a6d2fc538d/registry-server/0.log" Dec 05 07:11:47 crc kubenswrapper[4865]: I1205 07:11:47.940891 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f4cmh_4aa63a1a-0917-4c28-9307-4580800618e2/extract-utilities/0.log" Dec 05 07:11:48 crc kubenswrapper[4865]: I1205 07:11:48.148924 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f4cmh_4aa63a1a-0917-4c28-9307-4580800618e2/extract-content/0.log" Dec 05 07:11:48 crc kubenswrapper[4865]: I1205 07:11:48.149024 4865 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-f4cmh_4aa63a1a-0917-4c28-9307-4580800618e2/extract-content/0.log" Dec 05 07:11:48 crc kubenswrapper[4865]: I1205 07:11:48.149052 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f4cmh_4aa63a1a-0917-4c28-9307-4580800618e2/extract-utilities/0.log" Dec 05 07:11:48 crc kubenswrapper[4865]: I1205 07:11:48.321342 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f4cmh_4aa63a1a-0917-4c28-9307-4580800618e2/extract-utilities/0.log" Dec 05 07:11:48 crc kubenswrapper[4865]: I1205 07:11:48.381319 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f4cmh_4aa63a1a-0917-4c28-9307-4580800618e2/extract-content/0.log" Dec 05 07:11:48 crc kubenswrapper[4865]: I1205 07:11:48.858345 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f4cmh_4aa63a1a-0917-4c28-9307-4580800618e2/registry-server/0.log" Dec 05 07:12:02 crc kubenswrapper[4865]: I1205 07:12:02.710379 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n8rdz"] Dec 05 07:12:02 crc kubenswrapper[4865]: E1205 07:12:02.711232 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3" containerName="extract-utilities" Dec 05 07:12:02 crc kubenswrapper[4865]: I1205 07:12:02.711244 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3" containerName="extract-utilities" Dec 05 07:12:02 crc kubenswrapper[4865]: E1205 07:12:02.711273 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3" containerName="registry-server" Dec 05 07:12:02 crc kubenswrapper[4865]: I1205 07:12:02.711279 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3" containerName="registry-server" Dec 05 07:12:02 crc kubenswrapper[4865]: E1205 07:12:02.711287 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3" containerName="extract-content" Dec 05 07:12:02 crc kubenswrapper[4865]: I1205 07:12:02.711294 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3" containerName="extract-content" Dec 05 07:12:02 crc kubenswrapper[4865]: I1205 07:12:02.711495 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdc0c3d7-c422-40cd-bf5a-e7b67116c0e3" containerName="registry-server" Dec 05 07:12:02 crc kubenswrapper[4865]: I1205 07:12:02.712962 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n8rdz" Dec 05 07:12:02 crc kubenswrapper[4865]: I1205 07:12:02.726082 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n8rdz"] Dec 05 07:12:02 crc kubenswrapper[4865]: I1205 07:12:02.844947 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fba48d1-a6de-4d99-87d6-b3d0362d60f6-utilities\") pod \"certified-operators-n8rdz\" (UID: \"1fba48d1-a6de-4d99-87d6-b3d0362d60f6\") " pod="openshift-marketplace/certified-operators-n8rdz" Dec 05 07:12:02 crc kubenswrapper[4865]: I1205 07:12:02.844985 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7w45\" (UniqueName: \"kubernetes.io/projected/1fba48d1-a6de-4d99-87d6-b3d0362d60f6-kube-api-access-z7w45\") pod \"certified-operators-n8rdz\" (UID: \"1fba48d1-a6de-4d99-87d6-b3d0362d60f6\") " pod="openshift-marketplace/certified-operators-n8rdz" Dec 05 07:12:02 crc kubenswrapper[4865]: I1205 07:12:02.845104 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fba48d1-a6de-4d99-87d6-b3d0362d60f6-catalog-content\") pod \"certified-operators-n8rdz\" (UID: \"1fba48d1-a6de-4d99-87d6-b3d0362d60f6\") " pod="openshift-marketplace/certified-operators-n8rdz" Dec 05 07:12:02 crc kubenswrapper[4865]: I1205 07:12:02.946846 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fba48d1-a6de-4d99-87d6-b3d0362d60f6-catalog-content\") pod \"certified-operators-n8rdz\" (UID: \"1fba48d1-a6de-4d99-87d6-b3d0362d60f6\") " pod="openshift-marketplace/certified-operators-n8rdz" Dec 05 07:12:02 crc kubenswrapper[4865]: I1205 07:12:02.946929 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fba48d1-a6de-4d99-87d6-b3d0362d60f6-utilities\") pod \"certified-operators-n8rdz\" (UID: \"1fba48d1-a6de-4d99-87d6-b3d0362d60f6\") " pod="openshift-marketplace/certified-operators-n8rdz" Dec 05 07:12:02 crc kubenswrapper[4865]: I1205 07:12:02.946959 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7w45\" (UniqueName: \"kubernetes.io/projected/1fba48d1-a6de-4d99-87d6-b3d0362d60f6-kube-api-access-z7w45\") pod \"certified-operators-n8rdz\" (UID: \"1fba48d1-a6de-4d99-87d6-b3d0362d60f6\") " pod="openshift-marketplace/certified-operators-n8rdz" Dec 05 07:12:02 crc kubenswrapper[4865]: I1205 07:12:02.947405 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fba48d1-a6de-4d99-87d6-b3d0362d60f6-catalog-content\") pod \"certified-operators-n8rdz\" (UID: \"1fba48d1-a6de-4d99-87d6-b3d0362d60f6\") " pod="openshift-marketplace/certified-operators-n8rdz" Dec 05 07:12:02 crc kubenswrapper[4865]: I1205 07:12:02.947528 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fba48d1-a6de-4d99-87d6-b3d0362d60f6-utilities\") pod \"certified-operators-n8rdz\" (UID: \"1fba48d1-a6de-4d99-87d6-b3d0362d60f6\") " pod="openshift-marketplace/certified-operators-n8rdz" Dec 05 07:12:02 crc kubenswrapper[4865]: I1205 07:12:02.967964 4865 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-z7w45\" (UniqueName: \"kubernetes.io/projected/1fba48d1-a6de-4d99-87d6-b3d0362d60f6-kube-api-access-z7w45\") pod \"certified-operators-n8rdz\" (UID: \"1fba48d1-a6de-4d99-87d6-b3d0362d60f6\") " pod="openshift-marketplace/certified-operators-n8rdz" Dec 05 07:12:03 crc kubenswrapper[4865]: I1205 07:12:03.037135 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n8rdz" Dec 05 07:12:03 crc kubenswrapper[4865]: I1205 07:12:03.429357 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n8rdz"] Dec 05 07:12:04 crc kubenswrapper[4865]: I1205 07:12:04.214306 4865 generic.go:334] "Generic (PLEG): container finished" podID="1fba48d1-a6de-4d99-87d6-b3d0362d60f6" containerID="8adc440463d0c9c270dccb9d2aadd926c0e5c8dd1e142746af7262042ee805bb" exitCode=0 Dec 05 07:12:04 crc kubenswrapper[4865]: I1205 07:12:04.214348 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n8rdz" event={"ID":"1fba48d1-a6de-4d99-87d6-b3d0362d60f6","Type":"ContainerDied","Data":"8adc440463d0c9c270dccb9d2aadd926c0e5c8dd1e142746af7262042ee805bb"} Dec 05 07:12:04 crc kubenswrapper[4865]: I1205 07:12:04.214373 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n8rdz" event={"ID":"1fba48d1-a6de-4d99-87d6-b3d0362d60f6","Type":"ContainerStarted","Data":"0a67c925c706a97000defb6baea722582ef621f7f7ad2552a90df7f2437f24b3"} Dec 05 07:12:04 crc kubenswrapper[4865]: I1205 07:12:04.216629 4865 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 07:12:05 crc kubenswrapper[4865]: I1205 07:12:05.227565 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n8rdz" event={"ID":"1fba48d1-a6de-4d99-87d6-b3d0362d60f6","Type":"ContainerStarted","Data":"ee62ea4bb4d54fdac071920d1751f3d38eb0f96d1bcbb06f3bad9d07b193d5cb"} Dec 05 07:12:06 crc kubenswrapper[4865]: I1205 07:12:06.238691 4865 generic.go:334] "Generic (PLEG): container finished" podID="1fba48d1-a6de-4d99-87d6-b3d0362d60f6" containerID="ee62ea4bb4d54fdac071920d1751f3d38eb0f96d1bcbb06f3bad9d07b193d5cb" exitCode=0 Dec 05 07:12:06 crc kubenswrapper[4865]: I1205 07:12:06.238782 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n8rdz" event={"ID":"1fba48d1-a6de-4d99-87d6-b3d0362d60f6","Type":"ContainerDied","Data":"ee62ea4bb4d54fdac071920d1751f3d38eb0f96d1bcbb06f3bad9d07b193d5cb"} Dec 05 07:12:08 crc kubenswrapper[4865]: I1205 07:12:08.256753 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n8rdz" event={"ID":"1fba48d1-a6de-4d99-87d6-b3d0362d60f6","Type":"ContainerStarted","Data":"1689cfdff073362111681ebc9c80a9e098d547db9ae28f9dbe6c3edbb2370058"} Dec 05 07:12:08 crc kubenswrapper[4865]: I1205 07:12:08.280328 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n8rdz" podStartSLOduration=3.661251181 podStartE2EDuration="6.28030553s" podCreationTimestamp="2025-12-05 07:12:02 +0000 UTC" firstStartedPulling="2025-12-05 07:12:04.216427889 +0000 UTC m=+4743.496439111" lastFinishedPulling="2025-12-05 07:12:06.835482238 +0000 UTC m=+4746.115493460" observedRunningTime="2025-12-05 07:12:08.273619801 +0000 UTC m=+4747.553631033" watchObservedRunningTime="2025-12-05 
07:12:08.28030553 +0000 UTC m=+4747.560316752" Dec 05 07:12:11 crc kubenswrapper[4865]: I1205 07:12:11.048779 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 07:12:11 crc kubenswrapper[4865]: I1205 07:12:11.049221 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 07:12:13 crc kubenswrapper[4865]: I1205 07:12:13.037963 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n8rdz" Dec 05 07:12:13 crc kubenswrapper[4865]: I1205 07:12:13.038539 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n8rdz" Dec 05 07:12:13 crc kubenswrapper[4865]: I1205 07:12:13.103535 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n8rdz" Dec 05 07:12:13 crc kubenswrapper[4865]: I1205 07:12:13.448368 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n8rdz" Dec 05 07:12:13 crc kubenswrapper[4865]: I1205 07:12:13.511423 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n8rdz"] Dec 05 07:12:15 crc kubenswrapper[4865]: I1205 07:12:15.406694 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n8rdz" podUID="1fba48d1-a6de-4d99-87d6-b3d0362d60f6" containerName="registry-server" containerID="cri-o://1689cfdff073362111681ebc9c80a9e098d547db9ae28f9dbe6c3edbb2370058" gracePeriod=2 Dec 05 07:12:15 crc kubenswrapper[4865]: I1205 07:12:15.925673 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n8rdz" Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.013565 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fba48d1-a6de-4d99-87d6-b3d0362d60f6-utilities\") pod \"1fba48d1-a6de-4d99-87d6-b3d0362d60f6\" (UID: \"1fba48d1-a6de-4d99-87d6-b3d0362d60f6\") " Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.013643 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fba48d1-a6de-4d99-87d6-b3d0362d60f6-catalog-content\") pod \"1fba48d1-a6de-4d99-87d6-b3d0362d60f6\" (UID: \"1fba48d1-a6de-4d99-87d6-b3d0362d60f6\") " Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.013817 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7w45\" (UniqueName: \"kubernetes.io/projected/1fba48d1-a6de-4d99-87d6-b3d0362d60f6-kube-api-access-z7w45\") pod \"1fba48d1-a6de-4d99-87d6-b3d0362d60f6\" (UID: \"1fba48d1-a6de-4d99-87d6-b3d0362d60f6\") " Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.014206 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fba48d1-a6de-4d99-87d6-b3d0362d60f6-utilities" (OuterVolumeSpecName: "utilities") pod "1fba48d1-a6de-4d99-87d6-b3d0362d60f6" (UID: "1fba48d1-a6de-4d99-87d6-b3d0362d60f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.020065 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fba48d1-a6de-4d99-87d6-b3d0362d60f6-kube-api-access-z7w45" (OuterVolumeSpecName: "kube-api-access-z7w45") pod "1fba48d1-a6de-4d99-87d6-b3d0362d60f6" (UID: "1fba48d1-a6de-4d99-87d6-b3d0362d60f6"). InnerVolumeSpecName "kube-api-access-z7w45". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.106054 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fba48d1-a6de-4d99-87d6-b3d0362d60f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1fba48d1-a6de-4d99-87d6-b3d0362d60f6" (UID: "1fba48d1-a6de-4d99-87d6-b3d0362d60f6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.116029 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7w45\" (UniqueName: \"kubernetes.io/projected/1fba48d1-a6de-4d99-87d6-b3d0362d60f6-kube-api-access-z7w45\") on node \"crc\" DevicePath \"\"" Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.116074 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fba48d1-a6de-4d99-87d6-b3d0362d60f6-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.116085 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fba48d1-a6de-4d99-87d6-b3d0362d60f6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.416132 4865 generic.go:334] "Generic (PLEG): container finished" podID="1fba48d1-a6de-4d99-87d6-b3d0362d60f6" containerID="1689cfdff073362111681ebc9c80a9e098d547db9ae28f9dbe6c3edbb2370058" exitCode=0 Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.416173 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n8rdz" event={"ID":"1fba48d1-a6de-4d99-87d6-b3d0362d60f6","Type":"ContainerDied","Data":"1689cfdff073362111681ebc9c80a9e098d547db9ae28f9dbe6c3edbb2370058"} Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.416203 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n8rdz" event={"ID":"1fba48d1-a6de-4d99-87d6-b3d0362d60f6","Type":"ContainerDied","Data":"0a67c925c706a97000defb6baea722582ef621f7f7ad2552a90df7f2437f24b3"} Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.416226 4865 scope.go:117] "RemoveContainer" containerID="1689cfdff073362111681ebc9c80a9e098d547db9ae28f9dbe6c3edbb2370058" Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.416240 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n8rdz" Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.453737 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n8rdz"] Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.453821 4865 scope.go:117] "RemoveContainer" containerID="ee62ea4bb4d54fdac071920d1751f3d38eb0f96d1bcbb06f3bad9d07b193d5cb" Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.470120 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n8rdz"] Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.745095 4865 scope.go:117] "RemoveContainer" containerID="8adc440463d0c9c270dccb9d2aadd926c0e5c8dd1e142746af7262042ee805bb" Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.825001 4865 scope.go:117] "RemoveContainer" containerID="1689cfdff073362111681ebc9c80a9e098d547db9ae28f9dbe6c3edbb2370058" Dec 05 07:12:16 crc kubenswrapper[4865]: E1205 07:12:16.829779 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1689cfdff073362111681ebc9c80a9e098d547db9ae28f9dbe6c3edbb2370058\": container with ID starting with 1689cfdff073362111681ebc9c80a9e098d547db9ae28f9dbe6c3edbb2370058 not found: ID does not exist" containerID="1689cfdff073362111681ebc9c80a9e098d547db9ae28f9dbe6c3edbb2370058" Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.829885 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1689cfdff073362111681ebc9c80a9e098d547db9ae28f9dbe6c3edbb2370058"} err="failed to get container status \"1689cfdff073362111681ebc9c80a9e098d547db9ae28f9dbe6c3edbb2370058\": rpc error: code = NotFound desc = could not find container \"1689cfdff073362111681ebc9c80a9e098d547db9ae28f9dbe6c3edbb2370058\": container with ID starting with 1689cfdff073362111681ebc9c80a9e098d547db9ae28f9dbe6c3edbb2370058 not found: ID does not exist" Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.829925 4865 scope.go:117] "RemoveContainer" containerID="ee62ea4bb4d54fdac071920d1751f3d38eb0f96d1bcbb06f3bad9d07b193d5cb" Dec 05 07:12:16 crc kubenswrapper[4865]: E1205 07:12:16.831927 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee62ea4bb4d54fdac071920d1751f3d38eb0f96d1bcbb06f3bad9d07b193d5cb\": container with ID starting with ee62ea4bb4d54fdac071920d1751f3d38eb0f96d1bcbb06f3bad9d07b193d5cb not found: ID does not exist" containerID="ee62ea4bb4d54fdac071920d1751f3d38eb0f96d1bcbb06f3bad9d07b193d5cb" Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.831973 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee62ea4bb4d54fdac071920d1751f3d38eb0f96d1bcbb06f3bad9d07b193d5cb"} err="failed to get container status \"ee62ea4bb4d54fdac071920d1751f3d38eb0f96d1bcbb06f3bad9d07b193d5cb\": rpc error: code = NotFound desc = could not find container \"ee62ea4bb4d54fdac071920d1751f3d38eb0f96d1bcbb06f3bad9d07b193d5cb\": container with ID starting with ee62ea4bb4d54fdac071920d1751f3d38eb0f96d1bcbb06f3bad9d07b193d5cb not found: ID does not exist" Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.832046 4865 scope.go:117] "RemoveContainer" containerID="8adc440463d0c9c270dccb9d2aadd926c0e5c8dd1e142746af7262042ee805bb" Dec 05 07:12:16 crc kubenswrapper[4865]: E1205 07:12:16.832467 4865 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8adc440463d0c9c270dccb9d2aadd926c0e5c8dd1e142746af7262042ee805bb\": container with ID starting with 8adc440463d0c9c270dccb9d2aadd926c0e5c8dd1e142746af7262042ee805bb not found: ID does not exist" containerID="8adc440463d0c9c270dccb9d2aadd926c0e5c8dd1e142746af7262042ee805bb" Dec 05 07:12:16 crc kubenswrapper[4865]: I1205 07:12:16.832498 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8adc440463d0c9c270dccb9d2aadd926c0e5c8dd1e142746af7262042ee805bb"} err="failed to get container status \"8adc440463d0c9c270dccb9d2aadd926c0e5c8dd1e142746af7262042ee805bb\": rpc error: code = NotFound desc = could not find container \"8adc440463d0c9c270dccb9d2aadd926c0e5c8dd1e142746af7262042ee805bb\": container with ID starting with 8adc440463d0c9c270dccb9d2aadd926c0e5c8dd1e142746af7262042ee805bb not found: ID does not exist" Dec 05 07:12:17 crc kubenswrapper[4865]: I1205 07:12:17.016108 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fba48d1-a6de-4d99-87d6-b3d0362d60f6" path="/var/lib/kubelet/pods/1fba48d1-a6de-4d99-87d6-b3d0362d60f6/volumes" Dec 05 07:12:41 crc kubenswrapper[4865]: I1205 07:12:41.049087 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 07:12:41 crc kubenswrapper[4865]: I1205 07:12:41.049581 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 07:13:11 crc kubenswrapper[4865]: I1205 07:13:11.049352 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 07:13:11 crc kubenswrapper[4865]: I1205 07:13:11.049960 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 07:13:11 crc kubenswrapper[4865]: I1205 07:13:11.050029 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 07:13:11 crc kubenswrapper[4865]: I1205 07:13:11.051383 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b771b741cb44ec954a3a79410d5ce3b45d6985c5296d041b26ef7d452be8c635"} pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 07:13:11 crc kubenswrapper[4865]: I1205 07:13:11.051482 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" 
podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" containerID="cri-o://b771b741cb44ec954a3a79410d5ce3b45d6985c5296d041b26ef7d452be8c635" gracePeriod=600 Dec 05 07:13:12 crc kubenswrapper[4865]: I1205 07:13:12.076730 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="b771b741cb44ec954a3a79410d5ce3b45d6985c5296d041b26ef7d452be8c635" exitCode=0 Dec 05 07:13:12 crc kubenswrapper[4865]: I1205 07:13:12.076806 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"b771b741cb44ec954a3a79410d5ce3b45d6985c5296d041b26ef7d452be8c635"} Dec 05 07:13:12 crc kubenswrapper[4865]: I1205 07:13:12.077243 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392"} Dec 05 07:13:12 crc kubenswrapper[4865]: I1205 07:13:12.077267 4865 scope.go:117] "RemoveContainer" containerID="5361153ef41db6046f1c87e5cee62764c33d1d562785e6c57c4cc3fbbf394294" Dec 05 07:14:13 crc kubenswrapper[4865]: I1205 07:14:13.696508 4865 generic.go:334] "Generic (PLEG): container finished" podID="04ea41ec-f680-4a62-b48c-c08b1b13fba5" containerID="098327d78b3f8f9d3e4516f264ed2509bcc14bf00dab816d17ae49045a6a1e43" exitCode=0 Dec 05 07:14:13 crc kubenswrapper[4865]: I1205 07:14:13.696655 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dj7mb/must-gather-dqxr4" event={"ID":"04ea41ec-f680-4a62-b48c-c08b1b13fba5","Type":"ContainerDied","Data":"098327d78b3f8f9d3e4516f264ed2509bcc14bf00dab816d17ae49045a6a1e43"} Dec 05 07:14:13 crc kubenswrapper[4865]: I1205 07:14:13.698025 4865 scope.go:117] "RemoveContainer" containerID="098327d78b3f8f9d3e4516f264ed2509bcc14bf00dab816d17ae49045a6a1e43" Dec 05 07:14:14 crc kubenswrapper[4865]: I1205 07:14:14.213385 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dj7mb_must-gather-dqxr4_04ea41ec-f680-4a62-b48c-c08b1b13fba5/gather/0.log" Dec 05 07:14:22 crc kubenswrapper[4865]: I1205 07:14:22.972126 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dj7mb/must-gather-dqxr4"] Dec 05 07:14:22 crc kubenswrapper[4865]: I1205 07:14:22.974166 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-dj7mb/must-gather-dqxr4" podUID="04ea41ec-f680-4a62-b48c-c08b1b13fba5" containerName="copy" containerID="cri-o://f78fef9404961b97d11cf5cffcf60c62710d9bde297c6b65a458927cf00316f4" gracePeriod=2 Dec 05 07:14:23 crc kubenswrapper[4865]: I1205 07:14:22.985848 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dj7mb/must-gather-dqxr4"] Dec 05 07:14:23 crc kubenswrapper[4865]: I1205 07:14:23.419519 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dj7mb_must-gather-dqxr4_04ea41ec-f680-4a62-b48c-c08b1b13fba5/copy/0.log" Dec 05 07:14:23 crc kubenswrapper[4865]: I1205 07:14:23.420193 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dj7mb/must-gather-dqxr4" Dec 05 07:14:23 crc kubenswrapper[4865]: I1205 07:14:23.594620 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/04ea41ec-f680-4a62-b48c-c08b1b13fba5-must-gather-output\") pod \"04ea41ec-f680-4a62-b48c-c08b1b13fba5\" (UID: \"04ea41ec-f680-4a62-b48c-c08b1b13fba5\") " Dec 05 07:14:23 crc kubenswrapper[4865]: I1205 07:14:23.595063 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22grm\" (UniqueName: \"kubernetes.io/projected/04ea41ec-f680-4a62-b48c-c08b1b13fba5-kube-api-access-22grm\") pod \"04ea41ec-f680-4a62-b48c-c08b1b13fba5\" (UID: \"04ea41ec-f680-4a62-b48c-c08b1b13fba5\") " Dec 05 07:14:23 crc kubenswrapper[4865]: I1205 07:14:23.603381 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04ea41ec-f680-4a62-b48c-c08b1b13fba5-kube-api-access-22grm" (OuterVolumeSpecName: "kube-api-access-22grm") pod "04ea41ec-f680-4a62-b48c-c08b1b13fba5" (UID: "04ea41ec-f680-4a62-b48c-c08b1b13fba5"). InnerVolumeSpecName "kube-api-access-22grm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:14:23 crc kubenswrapper[4865]: I1205 07:14:23.697450 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22grm\" (UniqueName: \"kubernetes.io/projected/04ea41ec-f680-4a62-b48c-c08b1b13fba5-kube-api-access-22grm\") on node \"crc\" DevicePath \"\"" Dec 05 07:14:23 crc kubenswrapper[4865]: I1205 07:14:23.779650 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04ea41ec-f680-4a62-b48c-c08b1b13fba5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "04ea41ec-f680-4a62-b48c-c08b1b13fba5" (UID: "04ea41ec-f680-4a62-b48c-c08b1b13fba5"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:14:23 crc kubenswrapper[4865]: I1205 07:14:23.799470 4865 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/04ea41ec-f680-4a62-b48c-c08b1b13fba5-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 05 07:14:23 crc kubenswrapper[4865]: I1205 07:14:23.804301 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dj7mb_must-gather-dqxr4_04ea41ec-f680-4a62-b48c-c08b1b13fba5/copy/0.log" Dec 05 07:14:23 crc kubenswrapper[4865]: I1205 07:14:23.804690 4865 generic.go:334] "Generic (PLEG): container finished" podID="04ea41ec-f680-4a62-b48c-c08b1b13fba5" containerID="f78fef9404961b97d11cf5cffcf60c62710d9bde297c6b65a458927cf00316f4" exitCode=143 Dec 05 07:14:23 crc kubenswrapper[4865]: I1205 07:14:23.804773 4865 scope.go:117] "RemoveContainer" containerID="f78fef9404961b97d11cf5cffcf60c62710d9bde297c6b65a458927cf00316f4" Dec 05 07:14:23 crc kubenswrapper[4865]: I1205 07:14:23.804837 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dj7mb/must-gather-dqxr4" Dec 05 07:14:23 crc kubenswrapper[4865]: I1205 07:14:23.873540 4865 scope.go:117] "RemoveContainer" containerID="098327d78b3f8f9d3e4516f264ed2509bcc14bf00dab816d17ae49045a6a1e43" Dec 05 07:14:23 crc kubenswrapper[4865]: I1205 07:14:23.934917 4865 scope.go:117] "RemoveContainer" containerID="f78fef9404961b97d11cf5cffcf60c62710d9bde297c6b65a458927cf00316f4" Dec 05 07:14:23 crc kubenswrapper[4865]: E1205 07:14:23.937421 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f78fef9404961b97d11cf5cffcf60c62710d9bde297c6b65a458927cf00316f4\": container with ID starting with f78fef9404961b97d11cf5cffcf60c62710d9bde297c6b65a458927cf00316f4 not found: ID does not exist" containerID="f78fef9404961b97d11cf5cffcf60c62710d9bde297c6b65a458927cf00316f4" Dec 05 07:14:23 crc kubenswrapper[4865]: I1205 07:14:23.937455 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f78fef9404961b97d11cf5cffcf60c62710d9bde297c6b65a458927cf00316f4"} err="failed to get container status \"f78fef9404961b97d11cf5cffcf60c62710d9bde297c6b65a458927cf00316f4\": rpc error: code = NotFound desc = could not find container \"f78fef9404961b97d11cf5cffcf60c62710d9bde297c6b65a458927cf00316f4\": container with ID starting with f78fef9404961b97d11cf5cffcf60c62710d9bde297c6b65a458927cf00316f4 not found: ID does not exist" Dec 05 07:14:23 crc kubenswrapper[4865]: I1205 07:14:23.937480 4865 scope.go:117] "RemoveContainer" containerID="098327d78b3f8f9d3e4516f264ed2509bcc14bf00dab816d17ae49045a6a1e43" Dec 05 07:14:23 crc kubenswrapper[4865]: E1205 07:14:23.939038 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"098327d78b3f8f9d3e4516f264ed2509bcc14bf00dab816d17ae49045a6a1e43\": container with ID starting with 098327d78b3f8f9d3e4516f264ed2509bcc14bf00dab816d17ae49045a6a1e43 not found: ID does not exist" containerID="098327d78b3f8f9d3e4516f264ed2509bcc14bf00dab816d17ae49045a6a1e43" Dec 05 07:14:23 crc kubenswrapper[4865]: I1205 07:14:23.939061 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"098327d78b3f8f9d3e4516f264ed2509bcc14bf00dab816d17ae49045a6a1e43"} err="failed to get container status \"098327d78b3f8f9d3e4516f264ed2509bcc14bf00dab816d17ae49045a6a1e43\": rpc error: code = NotFound desc = could not find container \"098327d78b3f8f9d3e4516f264ed2509bcc14bf00dab816d17ae49045a6a1e43\": container with ID starting with 098327d78b3f8f9d3e4516f264ed2509bcc14bf00dab816d17ae49045a6a1e43 not found: ID does not exist" Dec 05 07:14:25 crc kubenswrapper[4865]: I1205 07:14:25.021857 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04ea41ec-f680-4a62-b48c-c08b1b13fba5" path="/var/lib/kubelet/pods/04ea41ec-f680-4a62-b48c-c08b1b13fba5/volumes" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.157284 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr"] Dec 05 07:15:00 crc kubenswrapper[4865]: E1205 07:15:00.158390 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ea41ec-f680-4a62-b48c-c08b1b13fba5" containerName="gather" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.158409 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ea41ec-f680-4a62-b48c-c08b1b13fba5" containerName="gather" Dec 
05 07:15:00 crc kubenswrapper[4865]: E1205 07:15:00.158457 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ea41ec-f680-4a62-b48c-c08b1b13fba5" containerName="copy" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.158466 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ea41ec-f680-4a62-b48c-c08b1b13fba5" containerName="copy" Dec 05 07:15:00 crc kubenswrapper[4865]: E1205 07:15:00.158483 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fba48d1-a6de-4d99-87d6-b3d0362d60f6" containerName="registry-server" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.158490 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fba48d1-a6de-4d99-87d6-b3d0362d60f6" containerName="registry-server" Dec 05 07:15:00 crc kubenswrapper[4865]: E1205 07:15:00.158525 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fba48d1-a6de-4d99-87d6-b3d0362d60f6" containerName="extract-utilities" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.158535 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fba48d1-a6de-4d99-87d6-b3d0362d60f6" containerName="extract-utilities" Dec 05 07:15:00 crc kubenswrapper[4865]: E1205 07:15:00.158551 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fba48d1-a6de-4d99-87d6-b3d0362d60f6" containerName="extract-content" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.158559 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fba48d1-a6de-4d99-87d6-b3d0362d60f6" containerName="extract-content" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.158894 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fba48d1-a6de-4d99-87d6-b3d0362d60f6" containerName="registry-server" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.158931 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ea41ec-f680-4a62-b48c-c08b1b13fba5" containerName="copy" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.158962 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ea41ec-f680-4a62-b48c-c08b1b13fba5" containerName="gather" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.160022 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.163535 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.167889 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.169354 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr"] Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.263374 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/179078e1-8673-484f-92aa-7f25d2d0382a-config-volume\") pod \"collect-profiles-29415315-grlwr\" (UID: \"179078e1-8673-484f-92aa-7f25d2d0382a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.263438 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/179078e1-8673-484f-92aa-7f25d2d0382a-secret-volume\") pod \"collect-profiles-29415315-grlwr\" (UID: \"179078e1-8673-484f-92aa-7f25d2d0382a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.263594 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnf6s\" (UniqueName: \"kubernetes.io/projected/179078e1-8673-484f-92aa-7f25d2d0382a-kube-api-access-bnf6s\") pod \"collect-profiles-29415315-grlwr\" (UID: \"179078e1-8673-484f-92aa-7f25d2d0382a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.365143 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/179078e1-8673-484f-92aa-7f25d2d0382a-config-volume\") pod \"collect-profiles-29415315-grlwr\" (UID: \"179078e1-8673-484f-92aa-7f25d2d0382a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.365199 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/179078e1-8673-484f-92aa-7f25d2d0382a-secret-volume\") pod \"collect-profiles-29415315-grlwr\" (UID: \"179078e1-8673-484f-92aa-7f25d2d0382a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.365269 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnf6s\" (UniqueName: \"kubernetes.io/projected/179078e1-8673-484f-92aa-7f25d2d0382a-kube-api-access-bnf6s\") pod \"collect-profiles-29415315-grlwr\" (UID: \"179078e1-8673-484f-92aa-7f25d2d0382a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.366685 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/179078e1-8673-484f-92aa-7f25d2d0382a-config-volume\") pod 
\"collect-profiles-29415315-grlwr\" (UID: \"179078e1-8673-484f-92aa-7f25d2d0382a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.372534 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/179078e1-8673-484f-92aa-7f25d2d0382a-secret-volume\") pod \"collect-profiles-29415315-grlwr\" (UID: \"179078e1-8673-484f-92aa-7f25d2d0382a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.387761 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnf6s\" (UniqueName: \"kubernetes.io/projected/179078e1-8673-484f-92aa-7f25d2d0382a-kube-api-access-bnf6s\") pod \"collect-profiles-29415315-grlwr\" (UID: \"179078e1-8673-484f-92aa-7f25d2d0382a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.479312 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr" Dec 05 07:15:00 crc kubenswrapper[4865]: I1205 07:15:00.972682 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr"] Dec 05 07:15:01 crc kubenswrapper[4865]: I1205 07:15:01.190130 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr" event={"ID":"179078e1-8673-484f-92aa-7f25d2d0382a","Type":"ContainerStarted","Data":"0648c33098a75c62218c79a411955a951a7244611b45051f61cacaba397554e9"} Dec 05 07:15:01 crc kubenswrapper[4865]: I1205 07:15:01.190467 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr" event={"ID":"179078e1-8673-484f-92aa-7f25d2d0382a","Type":"ContainerStarted","Data":"c2b5ab8579d8b5ea4b44ed1f5e5bb0e645a57e2b42d23acc20650ece0f11139c"} Dec 05 07:15:01 crc kubenswrapper[4865]: I1205 07:15:01.209485 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr" podStartSLOduration=1.209463594 podStartE2EDuration="1.209463594s" podCreationTimestamp="2025-12-05 07:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 07:15:01.203956428 +0000 UTC m=+4920.483967660" watchObservedRunningTime="2025-12-05 07:15:01.209463594 +0000 UTC m=+4920.489474816" Dec 05 07:15:02 crc kubenswrapper[4865]: I1205 07:15:02.199532 4865 generic.go:334] "Generic (PLEG): container finished" podID="179078e1-8673-484f-92aa-7f25d2d0382a" containerID="0648c33098a75c62218c79a411955a951a7244611b45051f61cacaba397554e9" exitCode=0 Dec 05 07:15:02 crc kubenswrapper[4865]: I1205 07:15:02.199718 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr" event={"ID":"179078e1-8673-484f-92aa-7f25d2d0382a","Type":"ContainerDied","Data":"0648c33098a75c62218c79a411955a951a7244611b45051f61cacaba397554e9"} Dec 05 07:15:03 crc kubenswrapper[4865]: I1205 07:15:03.529176 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr" Dec 05 07:15:03 crc kubenswrapper[4865]: I1205 07:15:03.626946 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnf6s\" (UniqueName: \"kubernetes.io/projected/179078e1-8673-484f-92aa-7f25d2d0382a-kube-api-access-bnf6s\") pod \"179078e1-8673-484f-92aa-7f25d2d0382a\" (UID: \"179078e1-8673-484f-92aa-7f25d2d0382a\") " Dec 05 07:15:03 crc kubenswrapper[4865]: I1205 07:15:03.627075 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/179078e1-8673-484f-92aa-7f25d2d0382a-config-volume\") pod \"179078e1-8673-484f-92aa-7f25d2d0382a\" (UID: \"179078e1-8673-484f-92aa-7f25d2d0382a\") " Dec 05 07:15:03 crc kubenswrapper[4865]: I1205 07:15:03.627244 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/179078e1-8673-484f-92aa-7f25d2d0382a-secret-volume\") pod \"179078e1-8673-484f-92aa-7f25d2d0382a\" (UID: \"179078e1-8673-484f-92aa-7f25d2d0382a\") " Dec 05 07:15:03 crc kubenswrapper[4865]: I1205 07:15:03.627894 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/179078e1-8673-484f-92aa-7f25d2d0382a-config-volume" (OuterVolumeSpecName: "config-volume") pod "179078e1-8673-484f-92aa-7f25d2d0382a" (UID: "179078e1-8673-484f-92aa-7f25d2d0382a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 07:15:03 crc kubenswrapper[4865]: I1205 07:15:03.632909 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/179078e1-8673-484f-92aa-7f25d2d0382a-kube-api-access-bnf6s" (OuterVolumeSpecName: "kube-api-access-bnf6s") pod "179078e1-8673-484f-92aa-7f25d2d0382a" (UID: "179078e1-8673-484f-92aa-7f25d2d0382a"). InnerVolumeSpecName "kube-api-access-bnf6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:15:03 crc kubenswrapper[4865]: I1205 07:15:03.641963 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/179078e1-8673-484f-92aa-7f25d2d0382a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "179078e1-8673-484f-92aa-7f25d2d0382a" (UID: "179078e1-8673-484f-92aa-7f25d2d0382a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 07:15:03 crc kubenswrapper[4865]: I1205 07:15:03.732239 4865 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/179078e1-8673-484f-92aa-7f25d2d0382a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 07:15:03 crc kubenswrapper[4865]: I1205 07:15:03.732274 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnf6s\" (UniqueName: \"kubernetes.io/projected/179078e1-8673-484f-92aa-7f25d2d0382a-kube-api-access-bnf6s\") on node \"crc\" DevicePath \"\"" Dec 05 07:15:03 crc kubenswrapper[4865]: I1205 07:15:03.732285 4865 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/179078e1-8673-484f-92aa-7f25d2d0382a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 07:15:04 crc kubenswrapper[4865]: I1205 07:15:04.217479 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr" event={"ID":"179078e1-8673-484f-92aa-7f25d2d0382a","Type":"ContainerDied","Data":"c2b5ab8579d8b5ea4b44ed1f5e5bb0e645a57e2b42d23acc20650ece0f11139c"} Dec 05 07:15:04 crc kubenswrapper[4865]: I1205 07:15:04.217522 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2b5ab8579d8b5ea4b44ed1f5e5bb0e645a57e2b42d23acc20650ece0f11139c" Dec 05 07:15:04 crc kubenswrapper[4865]: I1205 07:15:04.217569 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415315-grlwr" Dec 05 07:15:04 crc kubenswrapper[4865]: I1205 07:15:04.284489 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs"] Dec 05 07:15:04 crc kubenswrapper[4865]: I1205 07:15:04.295614 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415270-6n2zs"] Dec 05 07:15:05 crc kubenswrapper[4865]: I1205 07:15:05.027953 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03a87491-ec02-4ff3-be61-05e7e49f9637" path="/var/lib/kubelet/pods/03a87491-ec02-4ff3-be61-05e7e49f9637/volumes" Dec 05 07:15:11 crc kubenswrapper[4865]: I1205 07:15:11.049397 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 07:15:11 crc kubenswrapper[4865]: I1205 07:15:11.049973 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 07:15:40 crc kubenswrapper[4865]: I1205 07:15:40.552639 4865 scope.go:117] "RemoveContainer" containerID="30ea87cf3a1797e349345f4fa69f145aaf13577a9e108ca45543f4c4944b90f0" Dec 05 07:15:41 crc kubenswrapper[4865]: I1205 07:15:41.049384 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 05 07:15:41 crc kubenswrapper[4865]: I1205 07:15:41.049438 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 07:16:11 crc kubenswrapper[4865]: I1205 07:16:11.048759 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 07:16:11 crc kubenswrapper[4865]: I1205 07:16:11.049390 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 07:16:11 crc kubenswrapper[4865]: I1205 07:16:11.049443 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 07:16:11 crc kubenswrapper[4865]: I1205 07:16:11.050249 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392"} pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 07:16:11 crc kubenswrapper[4865]: I1205 07:16:11.050308 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" containerID="cri-o://cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" gracePeriod=600 Dec 05 07:16:11 crc kubenswrapper[4865]: E1205 07:16:11.174620 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:16:11 crc kubenswrapper[4865]: I1205 07:16:11.855740 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" exitCode=0 Dec 05 07:16:11 crc kubenswrapper[4865]: I1205 07:16:11.855856 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392"} Dec 05 07:16:11 crc kubenswrapper[4865]: I1205 07:16:11.855924 4865 scope.go:117] "RemoveContainer" containerID="b771b741cb44ec954a3a79410d5ce3b45d6985c5296d041b26ef7d452be8c635" Dec 05 07:16:11 crc kubenswrapper[4865]: I1205 07:16:11.857180 4865 scope.go:117] "RemoveContainer" 
containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:16:11 crc kubenswrapper[4865]: E1205 07:16:11.857765 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:16:25 crc kubenswrapper[4865]: I1205 07:16:25.006638 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:16:25 crc kubenswrapper[4865]: E1205 07:16:25.007495 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:16:38 crc kubenswrapper[4865]: I1205 07:16:38.006814 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:16:38 crc kubenswrapper[4865]: E1205 07:16:38.007572 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:16:51 crc kubenswrapper[4865]: I1205 07:16:51.013503 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:16:51 crc kubenswrapper[4865]: E1205 07:16:51.014219 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:17:04 crc kubenswrapper[4865]: I1205 07:17:04.006372 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:17:04 crc kubenswrapper[4865]: E1205 07:17:04.007092 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:17:19 crc kubenswrapper[4865]: I1205 07:17:19.007978 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:17:19 crc kubenswrapper[4865]: E1205 07:17:19.009395 4865 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:17:25 crc kubenswrapper[4865]: I1205 07:17:25.194776 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lbvjn/must-gather-r8gjf"] Dec 05 07:17:25 crc kubenswrapper[4865]: E1205 07:17:25.195573 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179078e1-8673-484f-92aa-7f25d2d0382a" containerName="collect-profiles" Dec 05 07:17:25 crc kubenswrapper[4865]: I1205 07:17:25.195587 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="179078e1-8673-484f-92aa-7f25d2d0382a" containerName="collect-profiles" Dec 05 07:17:25 crc kubenswrapper[4865]: I1205 07:17:25.195809 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="179078e1-8673-484f-92aa-7f25d2d0382a" containerName="collect-profiles" Dec 05 07:17:25 crc kubenswrapper[4865]: I1205 07:17:25.196894 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lbvjn/must-gather-r8gjf" Dec 05 07:17:25 crc kubenswrapper[4865]: I1205 07:17:25.203313 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lbvjn"/"openshift-service-ca.crt" Dec 05 07:17:25 crc kubenswrapper[4865]: I1205 07:17:25.204015 4865 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lbvjn"/"kube-root-ca.crt" Dec 05 07:17:25 crc kubenswrapper[4865]: I1205 07:17:25.205729 4865 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-lbvjn"/"default-dockercfg-d2dwp" Dec 05 07:17:25 crc kubenswrapper[4865]: I1205 07:17:25.215343 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lbvjn/must-gather-r8gjf"] Dec 05 07:17:25 crc kubenswrapper[4865]: I1205 07:17:25.321022 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smrhd\" (UniqueName: \"kubernetes.io/projected/5b56c3ad-c323-4c60-8fff-ff6c79f5b63c-kube-api-access-smrhd\") pod \"must-gather-r8gjf\" (UID: \"5b56c3ad-c323-4c60-8fff-ff6c79f5b63c\") " pod="openshift-must-gather-lbvjn/must-gather-r8gjf" Dec 05 07:17:25 crc kubenswrapper[4865]: I1205 07:17:25.321186 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5b56c3ad-c323-4c60-8fff-ff6c79f5b63c-must-gather-output\") pod \"must-gather-r8gjf\" (UID: \"5b56c3ad-c323-4c60-8fff-ff6c79f5b63c\") " pod="openshift-must-gather-lbvjn/must-gather-r8gjf" Dec 05 07:17:25 crc kubenswrapper[4865]: I1205 07:17:25.422856 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5b56c3ad-c323-4c60-8fff-ff6c79f5b63c-must-gather-output\") pod \"must-gather-r8gjf\" (UID: \"5b56c3ad-c323-4c60-8fff-ff6c79f5b63c\") " pod="openshift-must-gather-lbvjn/must-gather-r8gjf" Dec 05 07:17:25 crc kubenswrapper[4865]: I1205 07:17:25.422946 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smrhd\" (UniqueName: 
\"kubernetes.io/projected/5b56c3ad-c323-4c60-8fff-ff6c79f5b63c-kube-api-access-smrhd\") pod \"must-gather-r8gjf\" (UID: \"5b56c3ad-c323-4c60-8fff-ff6c79f5b63c\") " pod="openshift-must-gather-lbvjn/must-gather-r8gjf" Dec 05 07:17:25 crc kubenswrapper[4865]: I1205 07:17:25.423431 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5b56c3ad-c323-4c60-8fff-ff6c79f5b63c-must-gather-output\") pod \"must-gather-r8gjf\" (UID: \"5b56c3ad-c323-4c60-8fff-ff6c79f5b63c\") " pod="openshift-must-gather-lbvjn/must-gather-r8gjf" Dec 05 07:17:25 crc kubenswrapper[4865]: I1205 07:17:25.446541 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smrhd\" (UniqueName: \"kubernetes.io/projected/5b56c3ad-c323-4c60-8fff-ff6c79f5b63c-kube-api-access-smrhd\") pod \"must-gather-r8gjf\" (UID: \"5b56c3ad-c323-4c60-8fff-ff6c79f5b63c\") " pod="openshift-must-gather-lbvjn/must-gather-r8gjf" Dec 05 07:17:25 crc kubenswrapper[4865]: I1205 07:17:25.513240 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lbvjn/must-gather-r8gjf" Dec 05 07:17:26 crc kubenswrapper[4865]: I1205 07:17:26.232938 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lbvjn/must-gather-r8gjf"] Dec 05 07:17:26 crc kubenswrapper[4865]: I1205 07:17:26.612368 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lbvjn/must-gather-r8gjf" event={"ID":"5b56c3ad-c323-4c60-8fff-ff6c79f5b63c","Type":"ContainerStarted","Data":"de5f8d6ae803d052135402712ef336ada74774de0acca7c448f444a5bebc4d9e"} Dec 05 07:17:26 crc kubenswrapper[4865]: I1205 07:17:26.612704 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lbvjn/must-gather-r8gjf" event={"ID":"5b56c3ad-c323-4c60-8fff-ff6c79f5b63c","Type":"ContainerStarted","Data":"2d0174eba217d2c900b70a8c27f29a1461c767d8253f7679e62b539e768ab1f3"} Dec 05 07:17:27 crc kubenswrapper[4865]: I1205 07:17:27.621841 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lbvjn/must-gather-r8gjf" event={"ID":"5b56c3ad-c323-4c60-8fff-ff6c79f5b63c","Type":"ContainerStarted","Data":"ff864269736f60482dd2a12ac16e741abad4eaed30d7834c226b3c3133ba7ce5"} Dec 05 07:17:27 crc kubenswrapper[4865]: I1205 07:17:27.644055 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lbvjn/must-gather-r8gjf" podStartSLOduration=2.644029887 podStartE2EDuration="2.644029887s" podCreationTimestamp="2025-12-05 07:17:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 07:17:27.638621124 +0000 UTC m=+5066.918632346" watchObservedRunningTime="2025-12-05 07:17:27.644029887 +0000 UTC m=+5066.924041119" Dec 05 07:17:29 crc kubenswrapper[4865]: E1205 07:17:29.505167 4865 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.147:34396->38.102.83.147:33339: write tcp 38.102.83.147:34396->38.102.83.147:33339: write: broken pipe Dec 05 07:17:30 crc kubenswrapper[4865]: I1205 07:17:30.471722 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lbvjn/crc-debug-zb6xf"] Dec 05 07:17:30 crc kubenswrapper[4865]: I1205 07:17:30.473555 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lbvjn/crc-debug-zb6xf" Dec 05 07:17:30 crc kubenswrapper[4865]: I1205 07:17:30.518741 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msq6s\" (UniqueName: \"kubernetes.io/projected/66a9cecb-a191-4e6c-ae82-d971d4cb743c-kube-api-access-msq6s\") pod \"crc-debug-zb6xf\" (UID: \"66a9cecb-a191-4e6c-ae82-d971d4cb743c\") " pod="openshift-must-gather-lbvjn/crc-debug-zb6xf" Dec 05 07:17:30 crc kubenswrapper[4865]: I1205 07:17:30.518812 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66a9cecb-a191-4e6c-ae82-d971d4cb743c-host\") pod \"crc-debug-zb6xf\" (UID: \"66a9cecb-a191-4e6c-ae82-d971d4cb743c\") " pod="openshift-must-gather-lbvjn/crc-debug-zb6xf" Dec 05 07:17:30 crc kubenswrapper[4865]: I1205 07:17:30.621387 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msq6s\" (UniqueName: \"kubernetes.io/projected/66a9cecb-a191-4e6c-ae82-d971d4cb743c-kube-api-access-msq6s\") pod \"crc-debug-zb6xf\" (UID: \"66a9cecb-a191-4e6c-ae82-d971d4cb743c\") " pod="openshift-must-gather-lbvjn/crc-debug-zb6xf" Dec 05 07:17:30 crc kubenswrapper[4865]: I1205 07:17:30.621492 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66a9cecb-a191-4e6c-ae82-d971d4cb743c-host\") pod \"crc-debug-zb6xf\" (UID: \"66a9cecb-a191-4e6c-ae82-d971d4cb743c\") " pod="openshift-must-gather-lbvjn/crc-debug-zb6xf" Dec 05 07:17:30 crc kubenswrapper[4865]: I1205 07:17:30.621694 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66a9cecb-a191-4e6c-ae82-d971d4cb743c-host\") pod \"crc-debug-zb6xf\" (UID: \"66a9cecb-a191-4e6c-ae82-d971d4cb743c\") " pod="openshift-must-gather-lbvjn/crc-debug-zb6xf" Dec 05 07:17:30 crc kubenswrapper[4865]: I1205 07:17:30.656075 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msq6s\" (UniqueName: \"kubernetes.io/projected/66a9cecb-a191-4e6c-ae82-d971d4cb743c-kube-api-access-msq6s\") pod \"crc-debug-zb6xf\" (UID: \"66a9cecb-a191-4e6c-ae82-d971d4cb743c\") " pod="openshift-must-gather-lbvjn/crc-debug-zb6xf" Dec 05 07:17:30 crc kubenswrapper[4865]: I1205 07:17:30.797474 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lbvjn/crc-debug-zb6xf" Dec 05 07:17:31 crc kubenswrapper[4865]: I1205 07:17:31.672935 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lbvjn/crc-debug-zb6xf" event={"ID":"66a9cecb-a191-4e6c-ae82-d971d4cb743c","Type":"ContainerStarted","Data":"9e93f511a5e9b539f30762189e6c1450ba33d4f309dc2b4a6d084e8a1b3cbc0b"} Dec 05 07:17:31 crc kubenswrapper[4865]: I1205 07:17:31.673368 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lbvjn/crc-debug-zb6xf" event={"ID":"66a9cecb-a191-4e6c-ae82-d971d4cb743c","Type":"ContainerStarted","Data":"66795cbab6fa288f839372c31cca9a034c0f4254760637f35bce39c85246554f"} Dec 05 07:17:34 crc kubenswrapper[4865]: I1205 07:17:34.007054 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:17:34 crc kubenswrapper[4865]: E1205 07:17:34.007883 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:17:46 crc kubenswrapper[4865]: I1205 07:17:46.007288 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:17:46 crc kubenswrapper[4865]: E1205 07:17:46.008171 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:18:01 crc kubenswrapper[4865]: I1205 07:18:01.013158 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:18:01 crc kubenswrapper[4865]: E1205 07:18:01.013842 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:18:12 crc kubenswrapper[4865]: I1205 07:18:12.014896 4865 generic.go:334] "Generic (PLEG): container finished" podID="66a9cecb-a191-4e6c-ae82-d971d4cb743c" containerID="9e93f511a5e9b539f30762189e6c1450ba33d4f309dc2b4a6d084e8a1b3cbc0b" exitCode=0 Dec 05 07:18:12 crc kubenswrapper[4865]: I1205 07:18:12.014961 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lbvjn/crc-debug-zb6xf" event={"ID":"66a9cecb-a191-4e6c-ae82-d971d4cb743c","Type":"ContainerDied","Data":"9e93f511a5e9b539f30762189e6c1450ba33d4f309dc2b4a6d084e8a1b3cbc0b"} Dec 05 07:18:13 crc kubenswrapper[4865]: I1205 07:18:13.124739 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lbvjn/crc-debug-zb6xf" Dec 05 07:18:13 crc kubenswrapper[4865]: I1205 07:18:13.164694 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lbvjn/crc-debug-zb6xf"] Dec 05 07:18:13 crc kubenswrapper[4865]: I1205 07:18:13.176649 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lbvjn/crc-debug-zb6xf"] Dec 05 07:18:13 crc kubenswrapper[4865]: I1205 07:18:13.227871 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msq6s\" (UniqueName: \"kubernetes.io/projected/66a9cecb-a191-4e6c-ae82-d971d4cb743c-kube-api-access-msq6s\") pod \"66a9cecb-a191-4e6c-ae82-d971d4cb743c\" (UID: \"66a9cecb-a191-4e6c-ae82-d971d4cb743c\") " Dec 05 07:18:13 crc kubenswrapper[4865]: I1205 07:18:13.228131 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66a9cecb-a191-4e6c-ae82-d971d4cb743c-host\") pod \"66a9cecb-a191-4e6c-ae82-d971d4cb743c\" (UID: \"66a9cecb-a191-4e6c-ae82-d971d4cb743c\") " Dec 05 07:18:13 crc kubenswrapper[4865]: I1205 07:18:13.228292 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66a9cecb-a191-4e6c-ae82-d971d4cb743c-host" (OuterVolumeSpecName: "host") pod "66a9cecb-a191-4e6c-ae82-d971d4cb743c" (UID: "66a9cecb-a191-4e6c-ae82-d971d4cb743c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 07:18:13 crc kubenswrapper[4865]: I1205 07:18:13.228856 4865 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66a9cecb-a191-4e6c-ae82-d971d4cb743c-host\") on node \"crc\" DevicePath \"\"" Dec 05 07:18:13 crc kubenswrapper[4865]: I1205 07:18:13.234058 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66a9cecb-a191-4e6c-ae82-d971d4cb743c-kube-api-access-msq6s" (OuterVolumeSpecName: "kube-api-access-msq6s") pod "66a9cecb-a191-4e6c-ae82-d971d4cb743c" (UID: "66a9cecb-a191-4e6c-ae82-d971d4cb743c"). InnerVolumeSpecName "kube-api-access-msq6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:18:13 crc kubenswrapper[4865]: I1205 07:18:13.330836 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msq6s\" (UniqueName: \"kubernetes.io/projected/66a9cecb-a191-4e6c-ae82-d971d4cb743c-kube-api-access-msq6s\") on node \"crc\" DevicePath \"\"" Dec 05 07:18:14 crc kubenswrapper[4865]: I1205 07:18:14.034712 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66795cbab6fa288f839372c31cca9a034c0f4254760637f35bce39c85246554f" Dec 05 07:18:14 crc kubenswrapper[4865]: I1205 07:18:14.034950 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lbvjn/crc-debug-zb6xf" Dec 05 07:18:14 crc kubenswrapper[4865]: I1205 07:18:14.436219 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lbvjn/crc-debug-7f6rj"] Dec 05 07:18:14 crc kubenswrapper[4865]: E1205 07:18:14.436774 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66a9cecb-a191-4e6c-ae82-d971d4cb743c" containerName="container-00" Dec 05 07:18:14 crc kubenswrapper[4865]: I1205 07:18:14.436787 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a9cecb-a191-4e6c-ae82-d971d4cb743c" containerName="container-00" Dec 05 07:18:14 crc kubenswrapper[4865]: I1205 07:18:14.437479 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="66a9cecb-a191-4e6c-ae82-d971d4cb743c" containerName="container-00" Dec 05 07:18:14 crc kubenswrapper[4865]: I1205 07:18:14.438162 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lbvjn/crc-debug-7f6rj" Dec 05 07:18:14 crc kubenswrapper[4865]: I1205 07:18:14.550043 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f666fb77-ba6b-4782-9aee-9a9617e101b1-host\") pod \"crc-debug-7f6rj\" (UID: \"f666fb77-ba6b-4782-9aee-9a9617e101b1\") " pod="openshift-must-gather-lbvjn/crc-debug-7f6rj" Dec 05 07:18:14 crc kubenswrapper[4865]: I1205 07:18:14.550089 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk67f\" (UniqueName: \"kubernetes.io/projected/f666fb77-ba6b-4782-9aee-9a9617e101b1-kube-api-access-vk67f\") pod \"crc-debug-7f6rj\" (UID: \"f666fb77-ba6b-4782-9aee-9a9617e101b1\") " pod="openshift-must-gather-lbvjn/crc-debug-7f6rj" Dec 05 07:18:14 crc kubenswrapper[4865]: I1205 07:18:14.651781 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f666fb77-ba6b-4782-9aee-9a9617e101b1-host\") pod \"crc-debug-7f6rj\" (UID: \"f666fb77-ba6b-4782-9aee-9a9617e101b1\") " pod="openshift-must-gather-lbvjn/crc-debug-7f6rj" Dec 05 07:18:14 crc kubenswrapper[4865]: I1205 07:18:14.651856 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk67f\" (UniqueName: \"kubernetes.io/projected/f666fb77-ba6b-4782-9aee-9a9617e101b1-kube-api-access-vk67f\") pod \"crc-debug-7f6rj\" (UID: \"f666fb77-ba6b-4782-9aee-9a9617e101b1\") " pod="openshift-must-gather-lbvjn/crc-debug-7f6rj" Dec 05 07:18:14 crc kubenswrapper[4865]: I1205 07:18:14.652192 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f666fb77-ba6b-4782-9aee-9a9617e101b1-host\") pod \"crc-debug-7f6rj\" (UID: \"f666fb77-ba6b-4782-9aee-9a9617e101b1\") " pod="openshift-must-gather-lbvjn/crc-debug-7f6rj" Dec 05 07:18:14 crc kubenswrapper[4865]: I1205 07:18:14.682156 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk67f\" (UniqueName: \"kubernetes.io/projected/f666fb77-ba6b-4782-9aee-9a9617e101b1-kube-api-access-vk67f\") pod \"crc-debug-7f6rj\" (UID: \"f666fb77-ba6b-4782-9aee-9a9617e101b1\") " pod="openshift-must-gather-lbvjn/crc-debug-7f6rj" Dec 05 07:18:14 crc kubenswrapper[4865]: I1205 07:18:14.758457 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lbvjn/crc-debug-7f6rj" Dec 05 07:18:14 crc kubenswrapper[4865]: W1205 07:18:14.791705 4865 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf666fb77_ba6b_4782_9aee_9a9617e101b1.slice/crio-42ad3ddd995a4410445975a645ff38a41b64959a73ac7b2123a2ac35d1362651 WatchSource:0}: Error finding container 42ad3ddd995a4410445975a645ff38a41b64959a73ac7b2123a2ac35d1362651: Status 404 returned error can't find the container with id 42ad3ddd995a4410445975a645ff38a41b64959a73ac7b2123a2ac35d1362651 Dec 05 07:18:15 crc kubenswrapper[4865]: I1205 07:18:15.017405 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66a9cecb-a191-4e6c-ae82-d971d4cb743c" path="/var/lib/kubelet/pods/66a9cecb-a191-4e6c-ae82-d971d4cb743c/volumes" Dec 05 07:18:15 crc kubenswrapper[4865]: I1205 07:18:15.043988 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lbvjn/crc-debug-7f6rj" event={"ID":"f666fb77-ba6b-4782-9aee-9a9617e101b1","Type":"ContainerStarted","Data":"6864cf4da0b2f80f827acccfbe3c655b666f6ca7d41fd07f2f22b3bb41328db8"} Dec 05 07:18:15 crc kubenswrapper[4865]: I1205 07:18:15.044036 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lbvjn/crc-debug-7f6rj" event={"ID":"f666fb77-ba6b-4782-9aee-9a9617e101b1","Type":"ContainerStarted","Data":"42ad3ddd995a4410445975a645ff38a41b64959a73ac7b2123a2ac35d1362651"} Dec 05 07:18:15 crc kubenswrapper[4865]: I1205 07:18:15.059331 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lbvjn/crc-debug-7f6rj" podStartSLOduration=1.059311976 podStartE2EDuration="1.059311976s" podCreationTimestamp="2025-12-05 07:18:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 07:18:15.05700044 +0000 UTC m=+5114.337011662" watchObservedRunningTime="2025-12-05 07:18:15.059311976 +0000 UTC m=+5114.339323198" Dec 05 07:18:16 crc kubenswrapper[4865]: I1205 07:18:16.006003 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:18:16 crc kubenswrapper[4865]: E1205 07:18:16.006294 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:18:16 crc kubenswrapper[4865]: I1205 07:18:16.063199 4865 generic.go:334] "Generic (PLEG): container finished" podID="f666fb77-ba6b-4782-9aee-9a9617e101b1" containerID="6864cf4da0b2f80f827acccfbe3c655b666f6ca7d41fd07f2f22b3bb41328db8" exitCode=0 Dec 05 07:18:16 crc kubenswrapper[4865]: I1205 07:18:16.063508 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lbvjn/crc-debug-7f6rj" event={"ID":"f666fb77-ba6b-4782-9aee-9a9617e101b1","Type":"ContainerDied","Data":"6864cf4da0b2f80f827acccfbe3c655b666f6ca7d41fd07f2f22b3bb41328db8"} Dec 05 07:18:17 crc kubenswrapper[4865]: I1205 07:18:17.182656 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lbvjn/crc-debug-7f6rj" Dec 05 07:18:17 crc kubenswrapper[4865]: I1205 07:18:17.292867 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lbvjn/crc-debug-7f6rj"] Dec 05 07:18:17 crc kubenswrapper[4865]: I1205 07:18:17.295563 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk67f\" (UniqueName: \"kubernetes.io/projected/f666fb77-ba6b-4782-9aee-9a9617e101b1-kube-api-access-vk67f\") pod \"f666fb77-ba6b-4782-9aee-9a9617e101b1\" (UID: \"f666fb77-ba6b-4782-9aee-9a9617e101b1\") " Dec 05 07:18:17 crc kubenswrapper[4865]: I1205 07:18:17.295619 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f666fb77-ba6b-4782-9aee-9a9617e101b1-host\") pod \"f666fb77-ba6b-4782-9aee-9a9617e101b1\" (UID: \"f666fb77-ba6b-4782-9aee-9a9617e101b1\") " Dec 05 07:18:17 crc kubenswrapper[4865]: I1205 07:18:17.296210 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f666fb77-ba6b-4782-9aee-9a9617e101b1-host" (OuterVolumeSpecName: "host") pod "f666fb77-ba6b-4782-9aee-9a9617e101b1" (UID: "f666fb77-ba6b-4782-9aee-9a9617e101b1"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 07:18:17 crc kubenswrapper[4865]: I1205 07:18:17.300042 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lbvjn/crc-debug-7f6rj"] Dec 05 07:18:17 crc kubenswrapper[4865]: I1205 07:18:17.301694 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f666fb77-ba6b-4782-9aee-9a9617e101b1-kube-api-access-vk67f" (OuterVolumeSpecName: "kube-api-access-vk67f") pod "f666fb77-ba6b-4782-9aee-9a9617e101b1" (UID: "f666fb77-ba6b-4782-9aee-9a9617e101b1"). InnerVolumeSpecName "kube-api-access-vk67f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:18:17 crc kubenswrapper[4865]: I1205 07:18:17.397987 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk67f\" (UniqueName: \"kubernetes.io/projected/f666fb77-ba6b-4782-9aee-9a9617e101b1-kube-api-access-vk67f\") on node \"crc\" DevicePath \"\"" Dec 05 07:18:17 crc kubenswrapper[4865]: I1205 07:18:17.398015 4865 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f666fb77-ba6b-4782-9aee-9a9617e101b1-host\") on node \"crc\" DevicePath \"\"" Dec 05 07:18:18 crc kubenswrapper[4865]: I1205 07:18:18.085210 4865 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42ad3ddd995a4410445975a645ff38a41b64959a73ac7b2123a2ac35d1362651" Dec 05 07:18:18 crc kubenswrapper[4865]: I1205 07:18:18.085509 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lbvjn/crc-debug-7f6rj" Dec 05 07:18:18 crc kubenswrapper[4865]: I1205 07:18:18.462037 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lbvjn/crc-debug-b4p2m"] Dec 05 07:18:18 crc kubenswrapper[4865]: E1205 07:18:18.463501 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f666fb77-ba6b-4782-9aee-9a9617e101b1" containerName="container-00" Dec 05 07:18:18 crc kubenswrapper[4865]: I1205 07:18:18.463604 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="f666fb77-ba6b-4782-9aee-9a9617e101b1" containerName="container-00" Dec 05 07:18:18 crc kubenswrapper[4865]: I1205 07:18:18.463937 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="f666fb77-ba6b-4782-9aee-9a9617e101b1" containerName="container-00" Dec 05 07:18:18 crc kubenswrapper[4865]: I1205 07:18:18.464605 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lbvjn/crc-debug-b4p2m" Dec 05 07:18:18 crc kubenswrapper[4865]: I1205 07:18:18.619874 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcvh2\" (UniqueName: \"kubernetes.io/projected/bddd5321-a53a-4218-9048-60371a23950f-kube-api-access-fcvh2\") pod \"crc-debug-b4p2m\" (UID: \"bddd5321-a53a-4218-9048-60371a23950f\") " pod="openshift-must-gather-lbvjn/crc-debug-b4p2m" Dec 05 07:18:18 crc kubenswrapper[4865]: I1205 07:18:18.619927 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bddd5321-a53a-4218-9048-60371a23950f-host\") pod \"crc-debug-b4p2m\" (UID: \"bddd5321-a53a-4218-9048-60371a23950f\") " pod="openshift-must-gather-lbvjn/crc-debug-b4p2m" Dec 05 07:18:18 crc kubenswrapper[4865]: I1205 07:18:18.722018 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcvh2\" (UniqueName: \"kubernetes.io/projected/bddd5321-a53a-4218-9048-60371a23950f-kube-api-access-fcvh2\") pod \"crc-debug-b4p2m\" (UID: \"bddd5321-a53a-4218-9048-60371a23950f\") " pod="openshift-must-gather-lbvjn/crc-debug-b4p2m" Dec 05 07:18:18 crc kubenswrapper[4865]: I1205 07:18:18.722072 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bddd5321-a53a-4218-9048-60371a23950f-host\") pod \"crc-debug-b4p2m\" (UID: \"bddd5321-a53a-4218-9048-60371a23950f\") " pod="openshift-must-gather-lbvjn/crc-debug-b4p2m" Dec 05 07:18:18 crc kubenswrapper[4865]: I1205 07:18:18.722305 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bddd5321-a53a-4218-9048-60371a23950f-host\") pod \"crc-debug-b4p2m\" (UID: \"bddd5321-a53a-4218-9048-60371a23950f\") " pod="openshift-must-gather-lbvjn/crc-debug-b4p2m" Dec 05 07:18:18 crc kubenswrapper[4865]: I1205 07:18:18.740261 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcvh2\" (UniqueName: \"kubernetes.io/projected/bddd5321-a53a-4218-9048-60371a23950f-kube-api-access-fcvh2\") pod \"crc-debug-b4p2m\" (UID: \"bddd5321-a53a-4218-9048-60371a23950f\") " pod="openshift-must-gather-lbvjn/crc-debug-b4p2m" Dec 05 07:18:18 crc kubenswrapper[4865]: I1205 07:18:18.779869 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lbvjn/crc-debug-b4p2m" Dec 05 07:18:19 crc kubenswrapper[4865]: I1205 07:18:19.021205 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f666fb77-ba6b-4782-9aee-9a9617e101b1" path="/var/lib/kubelet/pods/f666fb77-ba6b-4782-9aee-9a9617e101b1/volumes" Dec 05 07:18:19 crc kubenswrapper[4865]: I1205 07:18:19.098146 4865 generic.go:334] "Generic (PLEG): container finished" podID="bddd5321-a53a-4218-9048-60371a23950f" containerID="4ca249cb555cef9361aa69e36f9e09731bdf3503cd1c28fb58e528b95084aded" exitCode=0 Dec 05 07:18:19 crc kubenswrapper[4865]: I1205 07:18:19.098197 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lbvjn/crc-debug-b4p2m" event={"ID":"bddd5321-a53a-4218-9048-60371a23950f","Type":"ContainerDied","Data":"4ca249cb555cef9361aa69e36f9e09731bdf3503cd1c28fb58e528b95084aded"} Dec 05 07:18:19 crc kubenswrapper[4865]: I1205 07:18:19.098232 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lbvjn/crc-debug-b4p2m" event={"ID":"bddd5321-a53a-4218-9048-60371a23950f","Type":"ContainerStarted","Data":"7249b75a3dbc5136f65619c4da072de09a80db9c301bb66b4618a917cb62c5d1"} Dec 05 07:18:19 crc kubenswrapper[4865]: I1205 07:18:19.150871 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lbvjn/crc-debug-b4p2m"] Dec 05 07:18:19 crc kubenswrapper[4865]: I1205 07:18:19.159866 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lbvjn/crc-debug-b4p2m"] Dec 05 07:18:20 crc kubenswrapper[4865]: I1205 07:18:20.549168 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lbvjn/crc-debug-b4p2m" Dec 05 07:18:20 crc kubenswrapper[4865]: I1205 07:18:20.666912 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcvh2\" (UniqueName: \"kubernetes.io/projected/bddd5321-a53a-4218-9048-60371a23950f-kube-api-access-fcvh2\") pod \"bddd5321-a53a-4218-9048-60371a23950f\" (UID: \"bddd5321-a53a-4218-9048-60371a23950f\") " Dec 05 07:18:20 crc kubenswrapper[4865]: I1205 07:18:20.667002 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bddd5321-a53a-4218-9048-60371a23950f-host\") pod \"bddd5321-a53a-4218-9048-60371a23950f\" (UID: \"bddd5321-a53a-4218-9048-60371a23950f\") " Dec 05 07:18:20 crc kubenswrapper[4865]: I1205 07:18:20.667138 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bddd5321-a53a-4218-9048-60371a23950f-host" (OuterVolumeSpecName: "host") pod "bddd5321-a53a-4218-9048-60371a23950f" (UID: "bddd5321-a53a-4218-9048-60371a23950f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 07:18:20 crc kubenswrapper[4865]: I1205 07:18:20.667521 4865 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bddd5321-a53a-4218-9048-60371a23950f-host\") on node \"crc\" DevicePath \"\"" Dec 05 07:18:20 crc kubenswrapper[4865]: I1205 07:18:20.677958 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bddd5321-a53a-4218-9048-60371a23950f-kube-api-access-fcvh2" (OuterVolumeSpecName: "kube-api-access-fcvh2") pod "bddd5321-a53a-4218-9048-60371a23950f" (UID: "bddd5321-a53a-4218-9048-60371a23950f"). InnerVolumeSpecName "kube-api-access-fcvh2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:18:20 crc kubenswrapper[4865]: I1205 07:18:20.768721 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcvh2\" (UniqueName: \"kubernetes.io/projected/bddd5321-a53a-4218-9048-60371a23950f-kube-api-access-fcvh2\") on node \"crc\" DevicePath \"\"" Dec 05 07:18:21 crc kubenswrapper[4865]: I1205 07:18:21.018285 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bddd5321-a53a-4218-9048-60371a23950f" path="/var/lib/kubelet/pods/bddd5321-a53a-4218-9048-60371a23950f/volumes" Dec 05 07:18:21 crc kubenswrapper[4865]: I1205 07:18:21.120581 4865 scope.go:117] "RemoveContainer" containerID="4ca249cb555cef9361aa69e36f9e09731bdf3503cd1c28fb58e528b95084aded" Dec 05 07:18:21 crc kubenswrapper[4865]: I1205 07:18:21.120736 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lbvjn/crc-debug-b4p2m" Dec 05 07:18:29 crc kubenswrapper[4865]: I1205 07:18:29.008860 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:18:29 crc kubenswrapper[4865]: E1205 07:18:29.009582 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:18:42 crc kubenswrapper[4865]: I1205 07:18:42.006542 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:18:42 crc kubenswrapper[4865]: E1205 07:18:42.007672 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:18:54 crc kubenswrapper[4865]: I1205 07:18:54.006246 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:18:54 crc kubenswrapper[4865]: E1205 07:18:54.006982 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:19:06 crc kubenswrapper[4865]: I1205 07:19:06.006893 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:19:06 crc kubenswrapper[4865]: E1205 07:19:06.008980 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:19:06 crc kubenswrapper[4865]: I1205 07:19:06.301570 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-c67fc55d6-grhds_115995c2-39bf-4d60-bcf9-ca342384137a/barbican-api/0.log" Dec 05 07:19:06 crc kubenswrapper[4865]: I1205 07:19:06.418265 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-c67fc55d6-grhds_115995c2-39bf-4d60-bcf9-ca342384137a/barbican-api-log/0.log" Dec 05 07:19:06 crc kubenswrapper[4865]: I1205 07:19:06.477564 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6564bc679b-dbsbx_b71ad914-2c87-4cd5-94ad-ffc717f3600a/barbican-keystone-listener/0.log" Dec 05 07:19:06 crc kubenswrapper[4865]: I1205 07:19:06.626261 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6564bc679b-dbsbx_b71ad914-2c87-4cd5-94ad-ffc717f3600a/barbican-keystone-listener-log/0.log" Dec 05 07:19:06 crc kubenswrapper[4865]: I1205 07:19:06.724508 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-95f789cff-nbpm9_527588f6-952d-4f9c-990c-775b34d48d78/barbican-worker/0.log" Dec 05 07:19:06 crc kubenswrapper[4865]: I1205 07:19:06.811405 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-95f789cff-nbpm9_527588f6-952d-4f9c-990c-775b34d48d78/barbican-worker-log/0.log" Dec 05 07:19:06 crc kubenswrapper[4865]: I1205 07:19:06.940504 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-pkw6j_ea0e7080-5e20-4b45-9896-2cda6b9e332f/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:19:07 crc kubenswrapper[4865]: I1205 07:19:07.075663 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_335ba680-a368-498b-8356-ef03d2c5cfb1/ceilometer-central-agent/0.log" Dec 05 07:19:07 crc kubenswrapper[4865]: I1205 07:19:07.151330 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_335ba680-a368-498b-8356-ef03d2c5cfb1/ceilometer-notification-agent/0.log" Dec 05 07:19:07 crc kubenswrapper[4865]: I1205 07:19:07.208399 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_335ba680-a368-498b-8356-ef03d2c5cfb1/proxy-httpd/0.log" Dec 05 07:19:07 crc kubenswrapper[4865]: I1205 07:19:07.265505 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_335ba680-a368-498b-8356-ef03d2c5cfb1/sg-core/0.log" Dec 05 07:19:07 crc kubenswrapper[4865]: I1205 07:19:07.412082 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2d368636-72ce-46db-ab44-91489de4985f/cinder-api/0.log" Dec 05 07:19:07 crc kubenswrapper[4865]: I1205 07:19:07.457800 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2d368636-72ce-46db-ab44-91489de4985f/cinder-api-log/0.log" Dec 05 07:19:07 crc kubenswrapper[4865]: I1205 07:19:07.697302 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3d9b36dc-b4e2-4a85-ab48-63bf2318e717/cinder-scheduler/0.log" Dec 05 07:19:07 crc kubenswrapper[4865]: I1205 07:19:07.731615 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3d9b36dc-b4e2-4a85-ab48-63bf2318e717/probe/0.log" Dec 05 07:19:07 crc kubenswrapper[4865]: I1205 
07:19:07.804945 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-2652z_7ba48e1b-5d9a-436a-8250-297390ed1781/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:19:07 crc kubenswrapper[4865]: I1205 07:19:07.972957 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-v4ftt_f01b2a46-843f-4022-ac72-af49312bbcc8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:19:08 crc kubenswrapper[4865]: I1205 07:19:08.273714 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55fb7f8d4c-hp4ck_1e9a22c2-0e4d-4c25-b694-e3afc4721e58/init/0.log" Dec 05 07:19:08 crc kubenswrapper[4865]: I1205 07:19:08.488177 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55fb7f8d4c-hp4ck_1e9a22c2-0e4d-4c25-b694-e3afc4721e58/init/0.log" Dec 05 07:19:08 crc kubenswrapper[4865]: I1205 07:19:08.559142 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-ctdzt_e0f77448-e553-45f7-90db-3a800258bdf3/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:19:08 crc kubenswrapper[4865]: I1205 07:19:08.674976 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55fb7f8d4c-hp4ck_1e9a22c2-0e4d-4c25-b694-e3afc4721e58/dnsmasq-dns/0.log" Dec 05 07:19:08 crc kubenswrapper[4865]: I1205 07:19:08.831708 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bf5aac6f-3ab8-412a-92f3-6102f9b75238/glance-log/0.log" Dec 05 07:19:08 crc kubenswrapper[4865]: I1205 07:19:08.862100 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bf5aac6f-3ab8-412a-92f3-6102f9b75238/glance-httpd/0.log" Dec 05 07:19:09 crc kubenswrapper[4865]: I1205 07:19:09.029431 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e54c5fe9-12f5-40a2-a472-249d1510d49c/glance-httpd/0.log" Dec 05 07:19:09 crc kubenswrapper[4865]: I1205 07:19:09.060343 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e54c5fe9-12f5-40a2-a472-249d1510d49c/glance-log/0.log" Dec 05 07:19:09 crc kubenswrapper[4865]: I1205 07:19:09.341332 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-78c59b79fd-5jlv4_0b2dbfc6-6978-4613-a307-d4d4b4b88bc9/horizon/1.log" Dec 05 07:19:09 crc kubenswrapper[4865]: I1205 07:19:09.364740 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-78c59b79fd-5jlv4_0b2dbfc6-6978-4613-a307-d4d4b4b88bc9/horizon/0.log" Dec 05 07:19:09 crc kubenswrapper[4865]: I1205 07:19:09.844906 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-78c59b79fd-5jlv4_0b2dbfc6-6978-4613-a307-d4d4b4b88bc9/horizon-log/0.log" Dec 05 07:19:09 crc kubenswrapper[4865]: I1205 07:19:09.976559 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-br9nk_3112d62b-5125-4614-a5c3-6a50bf1cc515/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:19:10 crc kubenswrapper[4865]: I1205 07:19:10.012903 4865 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-pdpbp_65d6bbea-eb81-4cfb-ba7c-e56d423884f8/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:19:10 crc kubenswrapper[4865]: I1205 07:19:10.296159 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29415301-frs8q_20d71d16-5ffb-4f98-8aeb-8ecceb0db1d0/keystone-cron/0.log" Dec 05 07:19:10 crc kubenswrapper[4865]: I1205 07:19:10.577793 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_df463e57-e3b9-4829-bd44-94c3ec6a90fa/kube-state-metrics/0.log" Dec 05 07:19:10 crc kubenswrapper[4865]: I1205 07:19:10.666671 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rvldt_ef2fa284-2648-4c53-8443-e60705efb609/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:19:10 crc kubenswrapper[4865]: I1205 07:19:10.749287 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-668bb48dd6-6gzl7_52184630-757a-4290-a4a0-380b5ffb1c76/keystone-api/0.log" Dec 05 07:19:11 crc kubenswrapper[4865]: I1205 07:19:11.177637 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjsb_7073b1ac-84a6-4dc4-9ccb-4e8b711a34e9/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:19:11 crc kubenswrapper[4865]: I1205 07:19:11.433197 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67df96fc59-crcwg_b374397b-c64c-439b-b7eb-01d2fb34f474/neutron-httpd/0.log" Dec 05 07:19:11 crc kubenswrapper[4865]: I1205 07:19:11.543897 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67df96fc59-crcwg_b374397b-c64c-439b-b7eb-01d2fb34f474/neutron-api/0.log" Dec 05 07:19:12 crc kubenswrapper[4865]: I1205 07:19:12.354670 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_5d470cf8-c2ca-4bc1-ab26-d8762af687d1/nova-cell0-conductor-conductor/0.log" Dec 05 07:19:12 crc kubenswrapper[4865]: I1205 07:19:12.456727 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_2587c341-67da-4cfc-a5fc-44d3eeefa9a4/nova-cell1-conductor-conductor/0.log" Dec 05 07:19:12 crc kubenswrapper[4865]: I1205 07:19:12.993601 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_95b08d9c-9466-4aef-b330-160d014e1e9d/nova-api-log/0.log" Dec 05 07:19:12 crc kubenswrapper[4865]: I1205 07:19:12.996247 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_bd86e12e-6ef3-41e5-9f84-e8d45ddaead0/nova-cell1-novncproxy-novncproxy/0.log" Dec 05 07:19:13 crc kubenswrapper[4865]: I1205 07:19:13.191159 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-8pmn8_4b81cc6f-f002-4a0d-911f-2aedbec17e6c/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:19:13 crc kubenswrapper[4865]: I1205 07:19:13.379631 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_668d173f-5e28-427e-a382-f905813fc91e/nova-metadata-log/0.log" Dec 05 07:19:13 crc kubenswrapper[4865]: I1205 07:19:13.429149 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_95b08d9c-9466-4aef-b330-160d014e1e9d/nova-api-api/0.log" Dec 05 07:19:13 crc kubenswrapper[4865]: I1205 07:19:13.894209 4865 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_9a70babd-c8a6-442f-aa44-d013f3887c93/mysql-bootstrap/0.log" Dec 05 07:19:14 crc kubenswrapper[4865]: I1205 07:19:14.168507 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9a70babd-c8a6-442f-aa44-d013f3887c93/galera/0.log" Dec 05 07:19:14 crc kubenswrapper[4865]: I1205 07:19:14.197077 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9a70babd-c8a6-442f-aa44-d013f3887c93/mysql-bootstrap/0.log" Dec 05 07:19:14 crc kubenswrapper[4865]: I1205 07:19:14.286327 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_f6fdeb31-0c08-4c87-82a4-5a51af86aa1f/nova-scheduler-scheduler/0.log" Dec 05 07:19:14 crc kubenswrapper[4865]: I1205 07:19:14.493767 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8a62a048-0ebe-4e5e-988a-4dde7746af74/mysql-bootstrap/0.log" Dec 05 07:19:14 crc kubenswrapper[4865]: I1205 07:19:14.769412 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8a62a048-0ebe-4e5e-988a-4dde7746af74/mysql-bootstrap/0.log" Dec 05 07:19:14 crc kubenswrapper[4865]: I1205 07:19:14.792885 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8a62a048-0ebe-4e5e-988a-4dde7746af74/galera/0.log" Dec 05 07:19:14 crc kubenswrapper[4865]: I1205 07:19:14.973410 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_98a93aae-b37f-4577-9567-e527f3cab3c7/openstackclient/0.log" Dec 05 07:19:15 crc kubenswrapper[4865]: I1205 07:19:15.063866 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-56dth_30eebd2b-aed6-4866-bec4-da326d89821c/ovn-controller/0.log" Dec 05 07:19:15 crc kubenswrapper[4865]: I1205 07:19:15.295265 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-b4c2s_034bb156-f8de-4fb1-bb44-b952c3f6a019/openstack-network-exporter/0.log" Dec 05 07:19:15 crc kubenswrapper[4865]: I1205 07:19:15.518875 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jbvmz_b92328b7-456b-45ce-8416-765f465ac793/ovsdb-server-init/0.log" Dec 05 07:19:15 crc kubenswrapper[4865]: I1205 07:19:15.707594 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_668d173f-5e28-427e-a382-f905813fc91e/nova-metadata-metadata/0.log" Dec 05 07:19:15 crc kubenswrapper[4865]: I1205 07:19:15.760460 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jbvmz_b92328b7-456b-45ce-8416-765f465ac793/ovsdb-server-init/0.log" Dec 05 07:19:15 crc kubenswrapper[4865]: I1205 07:19:15.894073 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jbvmz_b92328b7-456b-45ce-8416-765f465ac793/ovsdb-server/0.log" Dec 05 07:19:16 crc kubenswrapper[4865]: I1205 07:19:16.125488 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c7ebd484-c0dc-45cf-a057-46cb8f76f212/openstack-network-exporter/0.log" Dec 05 07:19:16 crc kubenswrapper[4865]: I1205 07:19:16.956014 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jbvmz_b92328b7-456b-45ce-8416-765f465ac793/ovs-vswitchd/0.log" Dec 05 07:19:17 crc kubenswrapper[4865]: I1205 07:19:17.011882 4865 scope.go:117] "RemoveContainer" 
containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:19:17 crc kubenswrapper[4865]: E1205 07:19:17.012127 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:19:17 crc kubenswrapper[4865]: I1205 07:19:17.114307 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-xr5z8_d6e75882-16f5-4c56-90a8-43d35503e87d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:19:17 crc kubenswrapper[4865]: I1205 07:19:17.592944 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c7ebd484-c0dc-45cf-a057-46cb8f76f212/ovn-northd/0.log" Dec 05 07:19:17 crc kubenswrapper[4865]: I1205 07:19:17.650101 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_43a7a744-bc5d-4bb1-88a2-d90afeb9fdad/ovsdbserver-nb/0.log" Dec 05 07:19:17 crc kubenswrapper[4865]: I1205 07:19:17.704462 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_43a7a744-bc5d-4bb1-88a2-d90afeb9fdad/openstack-network-exporter/0.log" Dec 05 07:19:18 crc kubenswrapper[4865]: I1205 07:19:18.155109 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5400c67c-5f55-47eb-88dc-699ecf76bc95/openstack-network-exporter/0.log" Dec 05 07:19:18 crc kubenswrapper[4865]: I1205 07:19:18.296676 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5400c67c-5f55-47eb-88dc-699ecf76bc95/ovsdbserver-sb/0.log" Dec 05 07:19:19 crc kubenswrapper[4865]: I1205 07:19:19.036437 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9197b580-1cf6-4939-abfd-8dcac6a5df7e/setup-container/0.log" Dec 05 07:19:19 crc kubenswrapper[4865]: I1205 07:19:19.080650 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-699b5d9784-7n29d_44e70007-d815-432e-9cb5-bc2cc61a86fa/placement-api/0.log" Dec 05 07:19:19 crc kubenswrapper[4865]: I1205 07:19:19.131615 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-699b5d9784-7n29d_44e70007-d815-432e-9cb5-bc2cc61a86fa/placement-log/0.log" Dec 05 07:19:19 crc kubenswrapper[4865]: I1205 07:19:19.291511 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9197b580-1cf6-4939-abfd-8dcac6a5df7e/setup-container/0.log" Dec 05 07:19:19 crc kubenswrapper[4865]: I1205 07:19:19.291983 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9197b580-1cf6-4939-abfd-8dcac6a5df7e/rabbitmq/0.log" Dec 05 07:19:19 crc kubenswrapper[4865]: I1205 07:19:19.566376 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d853a9c1-f9c9-412e-91bb-9f87123db63d/setup-container/0.log" Dec 05 07:19:19 crc kubenswrapper[4865]: I1205 07:19:19.673217 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d853a9c1-f9c9-412e-91bb-9f87123db63d/setup-container/0.log" Dec 05 07:19:19 crc kubenswrapper[4865]: I1205 07:19:19.748675 4865 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_rabbitmq-server-0_d853a9c1-f9c9-412e-91bb-9f87123db63d/rabbitmq/0.log" Dec 05 07:19:20 crc kubenswrapper[4865]: I1205 07:19:20.002640 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-52fbx_4bbde20d-cc33-4f77-857e-41bb96a20fe9/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:19:20 crc kubenswrapper[4865]: I1205 07:19:20.117444 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-wwvsm_8555d929-3dc5-4d7c-9635-fcc096789e43/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:19:20 crc kubenswrapper[4865]: I1205 07:19:20.257889 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-nzpvj_644fb5cf-0fad-4825-9975-46e8c5f3e1ec/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:19:20 crc kubenswrapper[4865]: I1205 07:19:20.390870 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-sjvnw_c38b5b25-e372-4601-9b9d-6b9d883a6953/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:19:20 crc kubenswrapper[4865]: I1205 07:19:20.607479 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-j7bbd_0c8aa8a3-8378-4a97-af3f-2b59ad1d2a0b/ssh-known-hosts-edpm-deployment/0.log" Dec 05 07:19:20 crc kubenswrapper[4865]: I1205 07:19:20.944298 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-d56774dc9-sps89_5ae1380d-b481-4842-a4e5-6e96ad87b998/proxy-server/0.log" Dec 05 07:19:21 crc kubenswrapper[4865]: I1205 07:19:21.084018 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-gx5lg_1ce0946d-3001-4b6d-b8c9-ea92f1cb44ea/swift-ring-rebalance/0.log" Dec 05 07:19:21 crc kubenswrapper[4865]: I1205 07:19:21.110858 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-d56774dc9-sps89_5ae1380d-b481-4842-a4e5-6e96ad87b998/proxy-httpd/0.log" Dec 05 07:19:21 crc kubenswrapper[4865]: I1205 07:19:21.345520 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/account-reaper/0.log" Dec 05 07:19:21 crc kubenswrapper[4865]: I1205 07:19:21.381700 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/account-auditor/0.log" Dec 05 07:19:21 crc kubenswrapper[4865]: I1205 07:19:21.524518 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/account-replicator/0.log" Dec 05 07:19:21 crc kubenswrapper[4865]: I1205 07:19:21.531286 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/account-server/0.log" Dec 05 07:19:21 crc kubenswrapper[4865]: I1205 07:19:21.633680 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/container-auditor/0.log" Dec 05 07:19:21 crc kubenswrapper[4865]: I1205 07:19:21.853006 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/container-server/0.log" Dec 05 07:19:21 crc kubenswrapper[4865]: I1205 07:19:21.856382 4865 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/container-replicator/0.log" Dec 05 07:19:21 crc kubenswrapper[4865]: I1205 07:19:21.879597 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/container-updater/0.log" Dec 05 07:19:21 crc kubenswrapper[4865]: I1205 07:19:21.966722 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/object-auditor/0.log" Dec 05 07:19:22 crc kubenswrapper[4865]: I1205 07:19:22.123000 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/object-expirer/0.log" Dec 05 07:19:22 crc kubenswrapper[4865]: I1205 07:19:22.184860 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/object-server/0.log" Dec 05 07:19:22 crc kubenswrapper[4865]: I1205 07:19:22.194066 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/object-replicator/0.log" Dec 05 07:19:22 crc kubenswrapper[4865]: I1205 07:19:22.247267 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/object-updater/0.log" Dec 05 07:19:22 crc kubenswrapper[4865]: I1205 07:19:22.696224 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/rsync/0.log" Dec 05 07:19:22 crc kubenswrapper[4865]: I1205 07:19:22.802160 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_30fcadfd-ece2-43f2-8bdf-2de6a6b4f19f/swift-recon-cron/0.log" Dec 05 07:19:22 crc kubenswrapper[4865]: I1205 07:19:22.957441 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-h6pp7_4fe98c92-1aa9-444a-88d9-1280d7865f92/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:19:23 crc kubenswrapper[4865]: I1205 07:19:23.171662 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_564b1ff3-5b9c-4058-94b2-a488e26b27dc/tempest-tests-tempest-tests-runner/0.log" Dec 05 07:19:23 crc kubenswrapper[4865]: I1205 07:19:23.265579 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_43529b69-5dae-4d58-9246-664fe5f3489e/test-operator-logs-container/0.log" Dec 05 07:19:23 crc kubenswrapper[4865]: I1205 07:19:23.439389 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-h8xc2_f3b4e4ee-2945-4c62-97e2-c561996ed302/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 07:19:28 crc kubenswrapper[4865]: I1205 07:19:28.006118 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:19:28 crc kubenswrapper[4865]: E1205 07:19:28.006853 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 
05 07:19:33 crc kubenswrapper[4865]: I1205 07:19:33.265076 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_05035a7d-0d83-46dd-a889-3db64fb647e8/memcached/0.log" Dec 05 07:19:40 crc kubenswrapper[4865]: I1205 07:19:40.007327 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:19:40 crc kubenswrapper[4865]: E1205 07:19:40.008065 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:19:45 crc kubenswrapper[4865]: I1205 07:19:45.507679 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-52hcw"] Dec 05 07:19:45 crc kubenswrapper[4865]: E1205 07:19:45.515443 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bddd5321-a53a-4218-9048-60371a23950f" containerName="container-00" Dec 05 07:19:45 crc kubenswrapper[4865]: I1205 07:19:45.515482 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="bddd5321-a53a-4218-9048-60371a23950f" containerName="container-00" Dec 05 07:19:45 crc kubenswrapper[4865]: I1205 07:19:45.515764 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="bddd5321-a53a-4218-9048-60371a23950f" containerName="container-00" Dec 05 07:19:45 crc kubenswrapper[4865]: I1205 07:19:45.517235 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-52hcw" Dec 05 07:19:45 crc kubenswrapper[4865]: I1205 07:19:45.523055 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-52hcw"] Dec 05 07:19:45 crc kubenswrapper[4865]: I1205 07:19:45.620855 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73549ea4-0de8-41af-bcbd-25275a5e23a4-catalog-content\") pod \"community-operators-52hcw\" (UID: \"73549ea4-0de8-41af-bcbd-25275a5e23a4\") " pod="openshift-marketplace/community-operators-52hcw" Dec 05 07:19:45 crc kubenswrapper[4865]: I1205 07:19:45.620959 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvj2r\" (UniqueName: \"kubernetes.io/projected/73549ea4-0de8-41af-bcbd-25275a5e23a4-kube-api-access-pvj2r\") pod \"community-operators-52hcw\" (UID: \"73549ea4-0de8-41af-bcbd-25275a5e23a4\") " pod="openshift-marketplace/community-operators-52hcw" Dec 05 07:19:45 crc kubenswrapper[4865]: I1205 07:19:45.621020 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73549ea4-0de8-41af-bcbd-25275a5e23a4-utilities\") pod \"community-operators-52hcw\" (UID: \"73549ea4-0de8-41af-bcbd-25275a5e23a4\") " pod="openshift-marketplace/community-operators-52hcw" Dec 05 07:19:45 crc kubenswrapper[4865]: I1205 07:19:45.722873 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73549ea4-0de8-41af-bcbd-25275a5e23a4-catalog-content\") pod \"community-operators-52hcw\" (UID: 
\"73549ea4-0de8-41af-bcbd-25275a5e23a4\") " pod="openshift-marketplace/community-operators-52hcw" Dec 05 07:19:45 crc kubenswrapper[4865]: I1205 07:19:45.722938 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvj2r\" (UniqueName: \"kubernetes.io/projected/73549ea4-0de8-41af-bcbd-25275a5e23a4-kube-api-access-pvj2r\") pod \"community-operators-52hcw\" (UID: \"73549ea4-0de8-41af-bcbd-25275a5e23a4\") " pod="openshift-marketplace/community-operators-52hcw" Dec 05 07:19:45 crc kubenswrapper[4865]: I1205 07:19:45.722996 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73549ea4-0de8-41af-bcbd-25275a5e23a4-utilities\") pod \"community-operators-52hcw\" (UID: \"73549ea4-0de8-41af-bcbd-25275a5e23a4\") " pod="openshift-marketplace/community-operators-52hcw" Dec 05 07:19:45 crc kubenswrapper[4865]: I1205 07:19:45.723598 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73549ea4-0de8-41af-bcbd-25275a5e23a4-utilities\") pod \"community-operators-52hcw\" (UID: \"73549ea4-0de8-41af-bcbd-25275a5e23a4\") " pod="openshift-marketplace/community-operators-52hcw" Dec 05 07:19:45 crc kubenswrapper[4865]: I1205 07:19:45.723597 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73549ea4-0de8-41af-bcbd-25275a5e23a4-catalog-content\") pod \"community-operators-52hcw\" (UID: \"73549ea4-0de8-41af-bcbd-25275a5e23a4\") " pod="openshift-marketplace/community-operators-52hcw" Dec 05 07:19:45 crc kubenswrapper[4865]: I1205 07:19:45.743469 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvj2r\" (UniqueName: \"kubernetes.io/projected/73549ea4-0de8-41af-bcbd-25275a5e23a4-kube-api-access-pvj2r\") pod \"community-operators-52hcw\" (UID: \"73549ea4-0de8-41af-bcbd-25275a5e23a4\") " pod="openshift-marketplace/community-operators-52hcw" Dec 05 07:19:45 crc kubenswrapper[4865]: I1205 07:19:45.842114 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-52hcw" Dec 05 07:19:46 crc kubenswrapper[4865]: I1205 07:19:46.388049 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-52hcw"] Dec 05 07:19:47 crc kubenswrapper[4865]: I1205 07:19:47.260959 4865 generic.go:334] "Generic (PLEG): container finished" podID="73549ea4-0de8-41af-bcbd-25275a5e23a4" containerID="8397cdd32c115bc5d67d0a153a3b9c8e9e211a6f5f7bd679a420d2921523af6a" exitCode=0 Dec 05 07:19:47 crc kubenswrapper[4865]: I1205 07:19:47.261257 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52hcw" event={"ID":"73549ea4-0de8-41af-bcbd-25275a5e23a4","Type":"ContainerDied","Data":"8397cdd32c115bc5d67d0a153a3b9c8e9e211a6f5f7bd679a420d2921523af6a"} Dec 05 07:19:47 crc kubenswrapper[4865]: I1205 07:19:47.261283 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52hcw" event={"ID":"73549ea4-0de8-41af-bcbd-25275a5e23a4","Type":"ContainerStarted","Data":"8e3654053baa8612ba8e85abd8fca33329b9315386345588bfc0ad53a8c4db9c"} Dec 05 07:19:47 crc kubenswrapper[4865]: I1205 07:19:47.263361 4865 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 07:19:48 crc kubenswrapper[4865]: I1205 07:19:48.272644 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52hcw" event={"ID":"73549ea4-0de8-41af-bcbd-25275a5e23a4","Type":"ContainerStarted","Data":"5f76dee1b116ffff5c9c80831cc6b46c38f4d3c9c5c415e85fde1305c990af37"} Dec 05 07:19:49 crc kubenswrapper[4865]: I1205 07:19:49.288700 4865 generic.go:334] "Generic (PLEG): container finished" podID="73549ea4-0de8-41af-bcbd-25275a5e23a4" containerID="5f76dee1b116ffff5c9c80831cc6b46c38f4d3c9c5c415e85fde1305c990af37" exitCode=0 Dec 05 07:19:49 crc kubenswrapper[4865]: I1205 07:19:49.288768 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52hcw" event={"ID":"73549ea4-0de8-41af-bcbd-25275a5e23a4","Type":"ContainerDied","Data":"5f76dee1b116ffff5c9c80831cc6b46c38f4d3c9c5c415e85fde1305c990af37"} Dec 05 07:19:50 crc kubenswrapper[4865]: I1205 07:19:50.307570 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52hcw" event={"ID":"73549ea4-0de8-41af-bcbd-25275a5e23a4","Type":"ContainerStarted","Data":"758590f9609c8c45c1af6f9b1efbe64222702d07bc68b3a21528510d47e1f345"} Dec 05 07:19:50 crc kubenswrapper[4865]: I1205 07:19:50.329207 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-52hcw" podStartSLOduration=2.698324942 podStartE2EDuration="5.329186467s" podCreationTimestamp="2025-12-05 07:19:45 +0000 UTC" firstStartedPulling="2025-12-05 07:19:47.263049664 +0000 UTC m=+5206.543060886" lastFinishedPulling="2025-12-05 07:19:49.893911189 +0000 UTC m=+5209.173922411" observedRunningTime="2025-12-05 07:19:50.322859608 +0000 UTC m=+5209.602870830" watchObservedRunningTime="2025-12-05 07:19:50.329186467 +0000 UTC m=+5209.609197689" Dec 05 07:19:53 crc kubenswrapper[4865]: I1205 07:19:53.007065 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:19:53 crc kubenswrapper[4865]: E1205 07:19:53.007626 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:19:55 crc kubenswrapper[4865]: I1205 07:19:55.843707 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-52hcw" Dec 05 07:19:55 crc kubenswrapper[4865]: I1205 07:19:55.844320 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-52hcw" Dec 05 07:19:55 crc kubenswrapper[4865]: I1205 07:19:55.899876 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-52hcw" Dec 05 07:19:56 crc kubenswrapper[4865]: I1205 07:19:56.533169 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-52hcw" Dec 05 07:19:56 crc kubenswrapper[4865]: I1205 07:19:56.594126 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-52hcw"] Dec 05 07:19:58 crc kubenswrapper[4865]: I1205 07:19:58.158891 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9_afe32c73-a754-43f9-bc45-1ec0219469d9/util/0.log" Dec 05 07:19:58 crc kubenswrapper[4865]: I1205 07:19:58.369457 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-52hcw" podUID="73549ea4-0de8-41af-bcbd-25275a5e23a4" containerName="registry-server" containerID="cri-o://758590f9609c8c45c1af6f9b1efbe64222702d07bc68b3a21528510d47e1f345" gracePeriod=2 Dec 05 07:19:58 crc kubenswrapper[4865]: I1205 07:19:58.406844 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9_afe32c73-a754-43f9-bc45-1ec0219469d9/pull/0.log" Dec 05 07:19:58 crc kubenswrapper[4865]: I1205 07:19:58.462248 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9_afe32c73-a754-43f9-bc45-1ec0219469d9/util/0.log" Dec 05 07:19:58 crc kubenswrapper[4865]: I1205 07:19:58.471391 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9_afe32c73-a754-43f9-bc45-1ec0219469d9/pull/0.log" Dec 05 07:19:58 crc kubenswrapper[4865]: I1205 07:19:58.760625 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9_afe32c73-a754-43f9-bc45-1ec0219469d9/pull/0.log" Dec 05 07:19:58 crc kubenswrapper[4865]: I1205 07:19:58.774233 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9_afe32c73-a754-43f9-bc45-1ec0219469d9/util/0.log" Dec 05 07:19:58 crc kubenswrapper[4865]: I1205 07:19:58.828358 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7ee1904b0b36c5ac2b910f2b6a74ab6f6a37cc37d391b42d90c2aa632ac6tc9_afe32c73-a754-43f9-bc45-1ec0219469d9/extract/0.log" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.354574 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-52hcw" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.382019 4865 generic.go:334] "Generic (PLEG): container finished" podID="73549ea4-0de8-41af-bcbd-25275a5e23a4" containerID="758590f9609c8c45c1af6f9b1efbe64222702d07bc68b3a21528510d47e1f345" exitCode=0 Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.382069 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52hcw" event={"ID":"73549ea4-0de8-41af-bcbd-25275a5e23a4","Type":"ContainerDied","Data":"758590f9609c8c45c1af6f9b1efbe64222702d07bc68b3a21528510d47e1f345"} Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.382091 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-52hcw" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.382108 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52hcw" event={"ID":"73549ea4-0de8-41af-bcbd-25275a5e23a4","Type":"ContainerDied","Data":"8e3654053baa8612ba8e85abd8fca33329b9315386345588bfc0ad53a8c4db9c"} Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.382131 4865 scope.go:117] "RemoveContainer" containerID="758590f9609c8c45c1af6f9b1efbe64222702d07bc68b3a21528510d47e1f345" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.425155 4865 scope.go:117] "RemoveContainer" containerID="5f76dee1b116ffff5c9c80831cc6b46c38f4d3c9c5c415e85fde1305c990af37" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.461537 4865 scope.go:117] "RemoveContainer" containerID="8397cdd32c115bc5d67d0a153a3b9c8e9e211a6f5f7bd679a420d2921523af6a" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.473388 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-7jl8d_59231c2f-740e-4c04-af17-53dab82b3497/kube-rbac-proxy/0.log" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.511042 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73549ea4-0de8-41af-bcbd-25275a5e23a4-catalog-content\") pod \"73549ea4-0de8-41af-bcbd-25275a5e23a4\" (UID: \"73549ea4-0de8-41af-bcbd-25275a5e23a4\") " Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.511134 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73549ea4-0de8-41af-bcbd-25275a5e23a4-utilities\") pod \"73549ea4-0de8-41af-bcbd-25275a5e23a4\" (UID: \"73549ea4-0de8-41af-bcbd-25275a5e23a4\") " Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.511308 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvj2r\" (UniqueName: \"kubernetes.io/projected/73549ea4-0de8-41af-bcbd-25275a5e23a4-kube-api-access-pvj2r\") pod \"73549ea4-0de8-41af-bcbd-25275a5e23a4\" (UID: \"73549ea4-0de8-41af-bcbd-25275a5e23a4\") " Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.519004 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73549ea4-0de8-41af-bcbd-25275a5e23a4-kube-api-access-pvj2r" (OuterVolumeSpecName: "kube-api-access-pvj2r") pod "73549ea4-0de8-41af-bcbd-25275a5e23a4" (UID: "73549ea4-0de8-41af-bcbd-25275a5e23a4"). InnerVolumeSpecName "kube-api-access-pvj2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.519553 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73549ea4-0de8-41af-bcbd-25275a5e23a4-utilities" (OuterVolumeSpecName: "utilities") pod "73549ea4-0de8-41af-bcbd-25275a5e23a4" (UID: "73549ea4-0de8-41af-bcbd-25275a5e23a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.551278 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-7jl8d_59231c2f-740e-4c04-af17-53dab82b3497/manager/0.log" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.559958 4865 scope.go:117] "RemoveContainer" containerID="758590f9609c8c45c1af6f9b1efbe64222702d07bc68b3a21528510d47e1f345" Dec 05 07:19:59 crc kubenswrapper[4865]: E1205 07:19:59.561619 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"758590f9609c8c45c1af6f9b1efbe64222702d07bc68b3a21528510d47e1f345\": container with ID starting with 758590f9609c8c45c1af6f9b1efbe64222702d07bc68b3a21528510d47e1f345 not found: ID does not exist" containerID="758590f9609c8c45c1af6f9b1efbe64222702d07bc68b3a21528510d47e1f345" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.561668 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"758590f9609c8c45c1af6f9b1efbe64222702d07bc68b3a21528510d47e1f345"} err="failed to get container status \"758590f9609c8c45c1af6f9b1efbe64222702d07bc68b3a21528510d47e1f345\": rpc error: code = NotFound desc = could not find container \"758590f9609c8c45c1af6f9b1efbe64222702d07bc68b3a21528510d47e1f345\": container with ID starting with 758590f9609c8c45c1af6f9b1efbe64222702d07bc68b3a21528510d47e1f345 not found: ID does not exist" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.561697 4865 scope.go:117] "RemoveContainer" containerID="5f76dee1b116ffff5c9c80831cc6b46c38f4d3c9c5c415e85fde1305c990af37" Dec 05 07:19:59 crc kubenswrapper[4865]: E1205 07:19:59.564953 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f76dee1b116ffff5c9c80831cc6b46c38f4d3c9c5c415e85fde1305c990af37\": container with ID starting with 5f76dee1b116ffff5c9c80831cc6b46c38f4d3c9c5c415e85fde1305c990af37 not found: ID does not exist" containerID="5f76dee1b116ffff5c9c80831cc6b46c38f4d3c9c5c415e85fde1305c990af37" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.565002 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f76dee1b116ffff5c9c80831cc6b46c38f4d3c9c5c415e85fde1305c990af37"} err="failed to get container status \"5f76dee1b116ffff5c9c80831cc6b46c38f4d3c9c5c415e85fde1305c990af37\": rpc error: code = NotFound desc = could not find container \"5f76dee1b116ffff5c9c80831cc6b46c38f4d3c9c5c415e85fde1305c990af37\": container with ID starting with 5f76dee1b116ffff5c9c80831cc6b46c38f4d3c9c5c415e85fde1305c990af37 not found: ID does not exist" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.565033 4865 scope.go:117] "RemoveContainer" containerID="8397cdd32c115bc5d67d0a153a3b9c8e9e211a6f5f7bd679a420d2921523af6a" Dec 05 07:19:59 crc kubenswrapper[4865]: E1205 07:19:59.566885 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"8397cdd32c115bc5d67d0a153a3b9c8e9e211a6f5f7bd679a420d2921523af6a\": container with ID starting with 8397cdd32c115bc5d67d0a153a3b9c8e9e211a6f5f7bd679a420d2921523af6a not found: ID does not exist" containerID="8397cdd32c115bc5d67d0a153a3b9c8e9e211a6f5f7bd679a420d2921523af6a" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.566925 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8397cdd32c115bc5d67d0a153a3b9c8e9e211a6f5f7bd679a420d2921523af6a"} err="failed to get container status \"8397cdd32c115bc5d67d0a153a3b9c8e9e211a6f5f7bd679a420d2921523af6a\": rpc error: code = NotFound desc = could not find container \"8397cdd32c115bc5d67d0a153a3b9c8e9e211a6f5f7bd679a420d2921523af6a\": container with ID starting with 8397cdd32c115bc5d67d0a153a3b9c8e9e211a6f5f7bd679a420d2921523af6a not found: ID does not exist" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.586786 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73549ea4-0de8-41af-bcbd-25275a5e23a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73549ea4-0de8-41af-bcbd-25275a5e23a4" (UID: "73549ea4-0de8-41af-bcbd-25275a5e23a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.614058 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73549ea4-0de8-41af-bcbd-25275a5e23a4-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.614089 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvj2r\" (UniqueName: \"kubernetes.io/projected/73549ea4-0de8-41af-bcbd-25275a5e23a4-kube-api-access-pvj2r\") on node \"crc\" DevicePath \"\"" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.614101 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73549ea4-0de8-41af-bcbd-25275a5e23a4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.614647 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-97l79_f4fc5327-1468-48aa-9a51-e8be8bfb5629/kube-rbac-proxy/0.log" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.716271 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-52hcw"] Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.731300 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-52hcw"] Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.804565 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-97l79_f4fc5327-1468-48aa-9a51-e8be8bfb5629/manager/0.log" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.910789 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-j6st7_30f6dc0d-1962-42c0-a128-d7a54943d849/kube-rbac-proxy/0.log" Dec 05 07:19:59 crc kubenswrapper[4865]: I1205 07:19:59.937845 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-j6st7_30f6dc0d-1962-42c0-a128-d7a54943d849/manager/0.log" Dec 05 07:20:00 crc 
kubenswrapper[4865]: I1205 07:20:00.058085 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-7wrx8_a44f8567-c35d-4bf4-be5c-ffbde539bb3a/kube-rbac-proxy/0.log" Dec 05 07:20:00 crc kubenswrapper[4865]: I1205 07:20:00.254344 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-7wrx8_a44f8567-c35d-4bf4-be5c-ffbde539bb3a/manager/0.log" Dec 05 07:20:00 crc kubenswrapper[4865]: I1205 07:20:00.365438 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-kkstd_87bce1fb-16c2-4c47-aa02-3f94aa681b58/kube-rbac-proxy/0.log" Dec 05 07:20:00 crc kubenswrapper[4865]: I1205 07:20:00.388529 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-kkstd_87bce1fb-16c2-4c47-aa02-3f94aa681b58/manager/0.log" Dec 05 07:20:00 crc kubenswrapper[4865]: I1205 07:20:00.550986 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-8zkdr_db94fe25-0c93-4471-852d-45b20c0f266c/kube-rbac-proxy/0.log" Dec 05 07:20:00 crc kubenswrapper[4865]: I1205 07:20:00.619843 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-8zkdr_db94fe25-0c93-4471-852d-45b20c0f266c/manager/0.log" Dec 05 07:20:00 crc kubenswrapper[4865]: I1205 07:20:00.767723 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-758b7cbd9c-d2qcb_e13948be-6623-4815-af50-6e2b5ee807ba/kube-rbac-proxy/0.log" Dec 05 07:20:01 crc kubenswrapper[4865]: I1205 07:20:01.025080 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73549ea4-0de8-41af-bcbd-25275a5e23a4" path="/var/lib/kubelet/pods/73549ea4-0de8-41af-bcbd-25275a5e23a4/volumes" Dec 05 07:20:01 crc kubenswrapper[4865]: I1205 07:20:01.052122 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-r8f45_8d67bcae-4ae9-4545-8410-236efec0cc30/kube-rbac-proxy/0.log" Dec 05 07:20:01 crc kubenswrapper[4865]: I1205 07:20:01.099153 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-758b7cbd9c-d2qcb_e13948be-6623-4815-af50-6e2b5ee807ba/manager/0.log" Dec 05 07:20:01 crc kubenswrapper[4865]: I1205 07:20:01.127024 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-r8f45_8d67bcae-4ae9-4545-8410-236efec0cc30/manager/0.log" Dec 05 07:20:01 crc kubenswrapper[4865]: I1205 07:20:01.646059 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-mlcgh_c3d9f2e6-7658-4f43-8d62-72bd4305c06a/kube-rbac-proxy/0.log" Dec 05 07:20:01 crc kubenswrapper[4865]: I1205 07:20:01.795547 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-vjgsh_60a54835-3802-4f32-be4f-ea7ace9084f6/kube-rbac-proxy/0.log" Dec 05 07:20:01 crc kubenswrapper[4865]: I1205 07:20:01.883266 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-mlcgh_c3d9f2e6-7658-4f43-8d62-72bd4305c06a/manager/0.log" Dec 05 
07:20:01 crc kubenswrapper[4865]: I1205 07:20:01.989033 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-vjgsh_60a54835-3802-4f32-be4f-ea7ace9084f6/manager/0.log" Dec 05 07:20:02 crc kubenswrapper[4865]: I1205 07:20:02.058595 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-cv8vc_1bad98dd-eca3-4f98-884a-655e104b2d92/kube-rbac-proxy/0.log" Dec 05 07:20:02 crc kubenswrapper[4865]: I1205 07:20:02.157422 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-cv8vc_1bad98dd-eca3-4f98-884a-655e104b2d92/manager/0.log" Dec 05 07:20:02 crc kubenswrapper[4865]: I1205 07:20:02.293301 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-v25kd_2364f477-be51-4698-914a-94d0fd2dd983/kube-rbac-proxy/0.log" Dec 05 07:20:02 crc kubenswrapper[4865]: I1205 07:20:02.376527 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-v25kd_2364f477-be51-4698-914a-94d0fd2dd983/manager/0.log" Dec 05 07:20:02 crc kubenswrapper[4865]: I1205 07:20:02.508563 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-7nqrp_8e1c4c0e-047b-4727-9435-7192e4f48bea/kube-rbac-proxy/0.log" Dec 05 07:20:02 crc kubenswrapper[4865]: I1205 07:20:02.615307 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-7nqrp_8e1c4c0e-047b-4727-9435-7192e4f48bea/manager/0.log" Dec 05 07:20:02 crc kubenswrapper[4865]: I1205 07:20:02.715245 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-4546x_571eed7b-c231-42db-8acd-8f2efc828947/manager/0.log" Dec 05 07:20:02 crc kubenswrapper[4865]: I1205 07:20:02.723002 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-4546x_571eed7b-c231-42db-8acd-8f2efc828947/kube-rbac-proxy/0.log" Dec 05 07:20:02 crc kubenswrapper[4865]: I1205 07:20:02.850747 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fzpfrb_2d41068d-3439-4a1d-bb73-9d974c281d4c/manager/0.log" Dec 05 07:20:02 crc kubenswrapper[4865]: I1205 07:20:02.888465 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fzpfrb_2d41068d-3439-4a1d-bb73-9d974c281d4c/kube-rbac-proxy/0.log" Dec 05 07:20:03 crc kubenswrapper[4865]: I1205 07:20:03.363870 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-cgngs_5f11965e-838f-4054-ad28-f25e9ba54596/registry-server/0.log" Dec 05 07:20:03 crc kubenswrapper[4865]: I1205 07:20:03.417783 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-554dbdfbd5-l48sk_7f835712-3e64-4461-89e1-4eac5548bff5/operator/0.log" Dec 05 07:20:03 crc kubenswrapper[4865]: I1205 07:20:03.534157 4865 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-2jh72_51ef47f4-9d56-4555-9a53-007c8648651a/kube-rbac-proxy/0.log" Dec 05 07:20:03 crc kubenswrapper[4865]: I1205 07:20:03.785609 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-2jh72_51ef47f4-9d56-4555-9a53-007c8648651a/manager/0.log" Dec 05 07:20:03 crc kubenswrapper[4865]: I1205 07:20:03.875061 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-cdf4c_1caf6bc1-a2e2-4330-bc4f-1f324ec5de84/kube-rbac-proxy/0.log" Dec 05 07:20:03 crc kubenswrapper[4865]: I1205 07:20:03.927147 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-cdf4c_1caf6bc1-a2e2-4330-bc4f-1f324ec5de84/manager/0.log" Dec 05 07:20:04 crc kubenswrapper[4865]: I1205 07:20:04.087329 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6f6696b64-hqh47_0e3dd976-2c50-4721-a9a3-330c906f0e16/manager/0.log" Dec 05 07:20:04 crc kubenswrapper[4865]: I1205 07:20:04.213351 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-bg55s_c21265ee-9968-411a-9387-f0c3920b3883/operator/0.log" Dec 05 07:20:04 crc kubenswrapper[4865]: I1205 07:20:04.220902 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-qphvq_0445a96f-f840-45c4-a1c3-f4455c49b216/kube-rbac-proxy/0.log" Dec 05 07:20:04 crc kubenswrapper[4865]: I1205 07:20:04.331812 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-qphvq_0445a96f-f840-45c4-a1c3-f4455c49b216/manager/0.log" Dec 05 07:20:04 crc kubenswrapper[4865]: I1205 07:20:04.359979 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-cppdn_1363659b-58f9-4f41-800c-863dd656d2b8/kube-rbac-proxy/0.log" Dec 05 07:20:04 crc kubenswrapper[4865]: I1205 07:20:04.513654 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-cppdn_1363659b-58f9-4f41-800c-863dd656d2b8/manager/0.log" Dec 05 07:20:04 crc kubenswrapper[4865]: I1205 07:20:04.638306 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-j5vmw_9b591a19-b272-4a03-8164-c0296161feb7/kube-rbac-proxy/0.log" Dec 05 07:20:04 crc kubenswrapper[4865]: I1205 07:20:04.699945 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-j5vmw_9b591a19-b272-4a03-8164-c0296161feb7/manager/0.log" Dec 05 07:20:04 crc kubenswrapper[4865]: I1205 07:20:04.774748 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-zzb4b_cf1398f2-aa09-45bb-9a98-5fadca999284/kube-rbac-proxy/0.log" Dec 05 07:20:04 crc kubenswrapper[4865]: I1205 07:20:04.792277 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-zzb4b_cf1398f2-aa09-45bb-9a98-5fadca999284/manager/0.log" Dec 05 07:20:08 crc kubenswrapper[4865]: I1205 07:20:08.006917 4865 scope.go:117] "RemoveContainer" 
containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:20:08 crc kubenswrapper[4865]: E1205 07:20:08.007592 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:20:13 crc kubenswrapper[4865]: I1205 07:20:13.307721 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dh2cp"] Dec 05 07:20:13 crc kubenswrapper[4865]: E1205 07:20:13.308875 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73549ea4-0de8-41af-bcbd-25275a5e23a4" containerName="extract-utilities" Dec 05 07:20:13 crc kubenswrapper[4865]: I1205 07:20:13.308895 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="73549ea4-0de8-41af-bcbd-25275a5e23a4" containerName="extract-utilities" Dec 05 07:20:13 crc kubenswrapper[4865]: E1205 07:20:13.308924 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73549ea4-0de8-41af-bcbd-25275a5e23a4" containerName="extract-content" Dec 05 07:20:13 crc kubenswrapper[4865]: I1205 07:20:13.308932 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="73549ea4-0de8-41af-bcbd-25275a5e23a4" containerName="extract-content" Dec 05 07:20:13 crc kubenswrapper[4865]: E1205 07:20:13.308958 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73549ea4-0de8-41af-bcbd-25275a5e23a4" containerName="registry-server" Dec 05 07:20:13 crc kubenswrapper[4865]: I1205 07:20:13.308966 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="73549ea4-0de8-41af-bcbd-25275a5e23a4" containerName="registry-server" Dec 05 07:20:13 crc kubenswrapper[4865]: I1205 07:20:13.309198 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="73549ea4-0de8-41af-bcbd-25275a5e23a4" containerName="registry-server" Dec 05 07:20:13 crc kubenswrapper[4865]: I1205 07:20:13.310550 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dh2cp" Dec 05 07:20:13 crc kubenswrapper[4865]: I1205 07:20:13.344031 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dh2cp"] Dec 05 07:20:13 crc kubenswrapper[4865]: I1205 07:20:13.464866 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4scs\" (UniqueName: \"kubernetes.io/projected/8e8bca27-0835-42cf-af59-c9b2238ebeb2-kube-api-access-m4scs\") pod \"redhat-marketplace-dh2cp\" (UID: \"8e8bca27-0835-42cf-af59-c9b2238ebeb2\") " pod="openshift-marketplace/redhat-marketplace-dh2cp" Dec 05 07:20:13 crc kubenswrapper[4865]: I1205 07:20:13.464925 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e8bca27-0835-42cf-af59-c9b2238ebeb2-catalog-content\") pod \"redhat-marketplace-dh2cp\" (UID: \"8e8bca27-0835-42cf-af59-c9b2238ebeb2\") " pod="openshift-marketplace/redhat-marketplace-dh2cp" Dec 05 07:20:13 crc kubenswrapper[4865]: I1205 07:20:13.465061 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e8bca27-0835-42cf-af59-c9b2238ebeb2-utilities\") pod \"redhat-marketplace-dh2cp\" (UID: \"8e8bca27-0835-42cf-af59-c9b2238ebeb2\") " pod="openshift-marketplace/redhat-marketplace-dh2cp" Dec 05 07:20:13 crc kubenswrapper[4865]: I1205 07:20:13.566769 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4scs\" (UniqueName: \"kubernetes.io/projected/8e8bca27-0835-42cf-af59-c9b2238ebeb2-kube-api-access-m4scs\") pod \"redhat-marketplace-dh2cp\" (UID: \"8e8bca27-0835-42cf-af59-c9b2238ebeb2\") " pod="openshift-marketplace/redhat-marketplace-dh2cp" Dec 05 07:20:13 crc kubenswrapper[4865]: I1205 07:20:13.566861 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e8bca27-0835-42cf-af59-c9b2238ebeb2-catalog-content\") pod \"redhat-marketplace-dh2cp\" (UID: \"8e8bca27-0835-42cf-af59-c9b2238ebeb2\") " pod="openshift-marketplace/redhat-marketplace-dh2cp" Dec 05 07:20:13 crc kubenswrapper[4865]: I1205 07:20:13.566903 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e8bca27-0835-42cf-af59-c9b2238ebeb2-utilities\") pod \"redhat-marketplace-dh2cp\" (UID: \"8e8bca27-0835-42cf-af59-c9b2238ebeb2\") " pod="openshift-marketplace/redhat-marketplace-dh2cp" Dec 05 07:20:13 crc kubenswrapper[4865]: I1205 07:20:13.567334 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e8bca27-0835-42cf-af59-c9b2238ebeb2-catalog-content\") pod \"redhat-marketplace-dh2cp\" (UID: \"8e8bca27-0835-42cf-af59-c9b2238ebeb2\") " pod="openshift-marketplace/redhat-marketplace-dh2cp" Dec 05 07:20:13 crc kubenswrapper[4865]: I1205 07:20:13.567382 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e8bca27-0835-42cf-af59-c9b2238ebeb2-utilities\") pod \"redhat-marketplace-dh2cp\" (UID: \"8e8bca27-0835-42cf-af59-c9b2238ebeb2\") " pod="openshift-marketplace/redhat-marketplace-dh2cp" Dec 05 07:20:13 crc kubenswrapper[4865]: I1205 07:20:13.601280 4865 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-m4scs\" (UniqueName: \"kubernetes.io/projected/8e8bca27-0835-42cf-af59-c9b2238ebeb2-kube-api-access-m4scs\") pod \"redhat-marketplace-dh2cp\" (UID: \"8e8bca27-0835-42cf-af59-c9b2238ebeb2\") " pod="openshift-marketplace/redhat-marketplace-dh2cp" Dec 05 07:20:13 crc kubenswrapper[4865]: I1205 07:20:13.635170 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dh2cp" Dec 05 07:20:14 crc kubenswrapper[4865]: I1205 07:20:14.203009 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dh2cp"] Dec 05 07:20:14 crc kubenswrapper[4865]: I1205 07:20:14.578744 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dh2cp" event={"ID":"8e8bca27-0835-42cf-af59-c9b2238ebeb2","Type":"ContainerDied","Data":"cc1b7401dcceeb04190d28a816c5e0d7f333aad559fddc302b5fcdf459ea5836"} Dec 05 07:20:14 crc kubenswrapper[4865]: I1205 07:20:14.576950 4865 generic.go:334] "Generic (PLEG): container finished" podID="8e8bca27-0835-42cf-af59-c9b2238ebeb2" containerID="cc1b7401dcceeb04190d28a816c5e0d7f333aad559fddc302b5fcdf459ea5836" exitCode=0 Dec 05 07:20:14 crc kubenswrapper[4865]: I1205 07:20:14.580533 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dh2cp" event={"ID":"8e8bca27-0835-42cf-af59-c9b2238ebeb2","Type":"ContainerStarted","Data":"075907146f0649e23cc1e64d2d8350cc1b6be4e35ed5a9fc69e2bf91e035065b"} Dec 05 07:20:16 crc kubenswrapper[4865]: I1205 07:20:16.787411 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dh2cp" event={"ID":"8e8bca27-0835-42cf-af59-c9b2238ebeb2","Type":"ContainerStarted","Data":"0d6476a55d4d954f8f31230e7d13190385f739434f1e61f7c3c6a980eca616bd"} Dec 05 07:20:17 crc kubenswrapper[4865]: I1205 07:20:17.800699 4865 generic.go:334] "Generic (PLEG): container finished" podID="8e8bca27-0835-42cf-af59-c9b2238ebeb2" containerID="0d6476a55d4d954f8f31230e7d13190385f739434f1e61f7c3c6a980eca616bd" exitCode=0 Dec 05 07:20:17 crc kubenswrapper[4865]: I1205 07:20:17.800823 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dh2cp" event={"ID":"8e8bca27-0835-42cf-af59-c9b2238ebeb2","Type":"ContainerDied","Data":"0d6476a55d4d954f8f31230e7d13190385f739434f1e61f7c3c6a980eca616bd"} Dec 05 07:20:18 crc kubenswrapper[4865]: I1205 07:20:18.812927 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dh2cp" event={"ID":"8e8bca27-0835-42cf-af59-c9b2238ebeb2","Type":"ContainerStarted","Data":"6d086a5bdcd4066277dc888e024f8f5a956e2c9714fc897c34cfdc1a6f6426fa"} Dec 05 07:20:18 crc kubenswrapper[4865]: I1205 07:20:18.832043 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dh2cp" podStartSLOduration=2.157341898 podStartE2EDuration="5.832022421s" podCreationTimestamp="2025-12-05 07:20:13 +0000 UTC" firstStartedPulling="2025-12-05 07:20:14.580728493 +0000 UTC m=+5233.860739715" lastFinishedPulling="2025-12-05 07:20:18.255409006 +0000 UTC m=+5237.535420238" observedRunningTime="2025-12-05 07:20:18.829926032 +0000 UTC m=+5238.109937264" watchObservedRunningTime="2025-12-05 07:20:18.832022421 +0000 UTC m=+5238.112033643" Dec 05 07:20:23 crc kubenswrapper[4865]: I1205 07:20:23.007252 4865 scope.go:117] "RemoveContainer" 
containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:20:23 crc kubenswrapper[4865]: E1205 07:20:23.008142 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:20:23 crc kubenswrapper[4865]: I1205 07:20:23.636110 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dh2cp" Dec 05 07:20:23 crc kubenswrapper[4865]: I1205 07:20:23.636168 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dh2cp" Dec 05 07:20:23 crc kubenswrapper[4865]: I1205 07:20:23.712320 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dh2cp" Dec 05 07:20:23 crc kubenswrapper[4865]: I1205 07:20:23.902912 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dh2cp" Dec 05 07:20:23 crc kubenswrapper[4865]: I1205 07:20:23.956667 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dh2cp"] Dec 05 07:20:25 crc kubenswrapper[4865]: I1205 07:20:25.872125 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dh2cp" podUID="8e8bca27-0835-42cf-af59-c9b2238ebeb2" containerName="registry-server" containerID="cri-o://6d086a5bdcd4066277dc888e024f8f5a956e2c9714fc897c34cfdc1a6f6426fa" gracePeriod=2 Dec 05 07:20:26 crc kubenswrapper[4865]: I1205 07:20:26.314424 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dh2cp" Dec 05 07:20:26 crc kubenswrapper[4865]: I1205 07:20:26.461555 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e8bca27-0835-42cf-af59-c9b2238ebeb2-catalog-content\") pod \"8e8bca27-0835-42cf-af59-c9b2238ebeb2\" (UID: \"8e8bca27-0835-42cf-af59-c9b2238ebeb2\") " Dec 05 07:20:26 crc kubenswrapper[4865]: I1205 07:20:26.461697 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e8bca27-0835-42cf-af59-c9b2238ebeb2-utilities\") pod \"8e8bca27-0835-42cf-af59-c9b2238ebeb2\" (UID: \"8e8bca27-0835-42cf-af59-c9b2238ebeb2\") " Dec 05 07:20:26 crc kubenswrapper[4865]: I1205 07:20:26.461801 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4scs\" (UniqueName: \"kubernetes.io/projected/8e8bca27-0835-42cf-af59-c9b2238ebeb2-kube-api-access-m4scs\") pod \"8e8bca27-0835-42cf-af59-c9b2238ebeb2\" (UID: \"8e8bca27-0835-42cf-af59-c9b2238ebeb2\") " Dec 05 07:20:26 crc kubenswrapper[4865]: I1205 07:20:26.464048 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e8bca27-0835-42cf-af59-c9b2238ebeb2-utilities" (OuterVolumeSpecName: "utilities") pod "8e8bca27-0835-42cf-af59-c9b2238ebeb2" (UID: "8e8bca27-0835-42cf-af59-c9b2238ebeb2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:20:26 crc kubenswrapper[4865]: I1205 07:20:26.479227 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e8bca27-0835-42cf-af59-c9b2238ebeb2-kube-api-access-m4scs" (OuterVolumeSpecName: "kube-api-access-m4scs") pod "8e8bca27-0835-42cf-af59-c9b2238ebeb2" (UID: "8e8bca27-0835-42cf-af59-c9b2238ebeb2"). InnerVolumeSpecName "kube-api-access-m4scs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:20:26 crc kubenswrapper[4865]: I1205 07:20:26.488386 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e8bca27-0835-42cf-af59-c9b2238ebeb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e8bca27-0835-42cf-af59-c9b2238ebeb2" (UID: "8e8bca27-0835-42cf-af59-c9b2238ebeb2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:20:26 crc kubenswrapper[4865]: I1205 07:20:26.564435 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e8bca27-0835-42cf-af59-c9b2238ebeb2-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 07:20:26 crc kubenswrapper[4865]: I1205 07:20:26.564499 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4scs\" (UniqueName: \"kubernetes.io/projected/8e8bca27-0835-42cf-af59-c9b2238ebeb2-kube-api-access-m4scs\") on node \"crc\" DevicePath \"\"" Dec 05 07:20:26 crc kubenswrapper[4865]: I1205 07:20:26.564512 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e8bca27-0835-42cf-af59-c9b2238ebeb2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 07:20:26 crc kubenswrapper[4865]: I1205 07:20:26.892470 4865 generic.go:334] "Generic (PLEG): container finished" podID="8e8bca27-0835-42cf-af59-c9b2238ebeb2" containerID="6d086a5bdcd4066277dc888e024f8f5a956e2c9714fc897c34cfdc1a6f6426fa" exitCode=0 Dec 05 07:20:26 crc kubenswrapper[4865]: I1205 07:20:26.892524 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dh2cp" event={"ID":"8e8bca27-0835-42cf-af59-c9b2238ebeb2","Type":"ContainerDied","Data":"6d086a5bdcd4066277dc888e024f8f5a956e2c9714fc897c34cfdc1a6f6426fa"} Dec 05 07:20:26 crc kubenswrapper[4865]: I1205 07:20:26.892568 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dh2cp" event={"ID":"8e8bca27-0835-42cf-af59-c9b2238ebeb2","Type":"ContainerDied","Data":"075907146f0649e23cc1e64d2d8350cc1b6be4e35ed5a9fc69e2bf91e035065b"} Dec 05 07:20:26 crc kubenswrapper[4865]: I1205 07:20:26.892592 4865 scope.go:117] "RemoveContainer" containerID="6d086a5bdcd4066277dc888e024f8f5a956e2c9714fc897c34cfdc1a6f6426fa" Dec 05 07:20:26 crc kubenswrapper[4865]: I1205 07:20:26.893366 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dh2cp" Dec 05 07:20:26 crc kubenswrapper[4865]: I1205 07:20:26.917603 4865 scope.go:117] "RemoveContainer" containerID="0d6476a55d4d954f8f31230e7d13190385f739434f1e61f7c3c6a980eca616bd" Dec 05 07:20:26 crc kubenswrapper[4865]: I1205 07:20:26.936193 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dh2cp"] Dec 05 07:20:26 crc kubenswrapper[4865]: I1205 07:20:26.959395 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dh2cp"] Dec 05 07:20:26 crc kubenswrapper[4865]: I1205 07:20:26.981950 4865 scope.go:117] "RemoveContainer" containerID="cc1b7401dcceeb04190d28a816c5e0d7f333aad559fddc302b5fcdf459ea5836" Dec 05 07:20:27 crc kubenswrapper[4865]: I1205 07:20:27.013494 4865 scope.go:117] "RemoveContainer" containerID="6d086a5bdcd4066277dc888e024f8f5a956e2c9714fc897c34cfdc1a6f6426fa" Dec 05 07:20:27 crc kubenswrapper[4865]: E1205 07:20:27.016180 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d086a5bdcd4066277dc888e024f8f5a956e2c9714fc897c34cfdc1a6f6426fa\": container with ID starting with 6d086a5bdcd4066277dc888e024f8f5a956e2c9714fc897c34cfdc1a6f6426fa not found: ID does not exist" containerID="6d086a5bdcd4066277dc888e024f8f5a956e2c9714fc897c34cfdc1a6f6426fa" Dec 05 07:20:27 crc kubenswrapper[4865]: I1205 07:20:27.016305 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d086a5bdcd4066277dc888e024f8f5a956e2c9714fc897c34cfdc1a6f6426fa"} err="failed to get container status \"6d086a5bdcd4066277dc888e024f8f5a956e2c9714fc897c34cfdc1a6f6426fa\": rpc error: code = NotFound desc = could not find container \"6d086a5bdcd4066277dc888e024f8f5a956e2c9714fc897c34cfdc1a6f6426fa\": container with ID starting with 6d086a5bdcd4066277dc888e024f8f5a956e2c9714fc897c34cfdc1a6f6426fa not found: ID does not exist" Dec 05 07:20:27 crc kubenswrapper[4865]: I1205 07:20:27.016384 4865 scope.go:117] "RemoveContainer" containerID="0d6476a55d4d954f8f31230e7d13190385f739434f1e61f7c3c6a980eca616bd" Dec 05 07:20:27 crc kubenswrapper[4865]: E1205 07:20:27.016741 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d6476a55d4d954f8f31230e7d13190385f739434f1e61f7c3c6a980eca616bd\": container with ID starting with 0d6476a55d4d954f8f31230e7d13190385f739434f1e61f7c3c6a980eca616bd not found: ID does not exist" containerID="0d6476a55d4d954f8f31230e7d13190385f739434f1e61f7c3c6a980eca616bd" Dec 05 07:20:27 crc kubenswrapper[4865]: I1205 07:20:27.016845 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d6476a55d4d954f8f31230e7d13190385f739434f1e61f7c3c6a980eca616bd"} err="failed to get container status \"0d6476a55d4d954f8f31230e7d13190385f739434f1e61f7c3c6a980eca616bd\": rpc error: code = NotFound desc = could not find container \"0d6476a55d4d954f8f31230e7d13190385f739434f1e61f7c3c6a980eca616bd\": container with ID starting with 0d6476a55d4d954f8f31230e7d13190385f739434f1e61f7c3c6a980eca616bd not found: ID does not exist" Dec 05 07:20:27 crc kubenswrapper[4865]: I1205 07:20:27.016938 4865 scope.go:117] "RemoveContainer" containerID="cc1b7401dcceeb04190d28a816c5e0d7f333aad559fddc302b5fcdf459ea5836" Dec 05 07:20:27 crc kubenswrapper[4865]: E1205 07:20:27.017291 4865 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"cc1b7401dcceeb04190d28a816c5e0d7f333aad559fddc302b5fcdf459ea5836\": container with ID starting with cc1b7401dcceeb04190d28a816c5e0d7f333aad559fddc302b5fcdf459ea5836 not found: ID does not exist" containerID="cc1b7401dcceeb04190d28a816c5e0d7f333aad559fddc302b5fcdf459ea5836" Dec 05 07:20:27 crc kubenswrapper[4865]: I1205 07:20:27.017367 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc1b7401dcceeb04190d28a816c5e0d7f333aad559fddc302b5fcdf459ea5836"} err="failed to get container status \"cc1b7401dcceeb04190d28a816c5e0d7f333aad559fddc302b5fcdf459ea5836\": rpc error: code = NotFound desc = could not find container \"cc1b7401dcceeb04190d28a816c5e0d7f333aad559fddc302b5fcdf459ea5836\": container with ID starting with cc1b7401dcceeb04190d28a816c5e0d7f333aad559fddc302b5fcdf459ea5836 not found: ID does not exist" Dec 05 07:20:27 crc kubenswrapper[4865]: I1205 07:20:27.021644 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e8bca27-0835-42cf-af59-c9b2238ebeb2" path="/var/lib/kubelet/pods/8e8bca27-0835-42cf-af59-c9b2238ebeb2/volumes" Dec 05 07:20:27 crc kubenswrapper[4865]: I1205 07:20:27.605075 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-fmdqp_5293d191-528f-4818-b897-11bb456c2b50/control-plane-machine-set-operator/0.log" Dec 05 07:20:27 crc kubenswrapper[4865]: I1205 07:20:27.718142 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-v2jhh_c0cebc10-c0ad-419c-903c-341c516f1527/kube-rbac-proxy/0.log" Dec 05 07:20:28 crc kubenswrapper[4865]: I1205 07:20:28.066779 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-v2jhh_c0cebc10-c0ad-419c-903c-341c516f1527/machine-api-operator/0.log" Dec 05 07:20:35 crc kubenswrapper[4865]: I1205 07:20:35.006605 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:20:35 crc kubenswrapper[4865]: E1205 07:20:35.008528 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:20:41 crc kubenswrapper[4865]: I1205 07:20:41.553402 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-8fhmd_1ee0a305-a19d-4053-995b-e30a57c8cc07/cert-manager-controller/0.log" Dec 05 07:20:41 crc kubenswrapper[4865]: I1205 07:20:41.765040 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-nkfts_5d3a98df-9953-49ab-a722-f37837073178/cert-manager-webhook/0.log" Dec 05 07:20:41 crc kubenswrapper[4865]: I1205 07:20:41.918727 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-hs4kn_b2f0ff72-fd51-40eb-94c9-7dea7e5c48ae/cert-manager-cainjector/0.log" Dec 05 07:20:48 crc kubenswrapper[4865]: I1205 07:20:48.006310 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 
05 07:20:48 crc kubenswrapper[4865]: E1205 07:20:48.006988 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:20:56 crc kubenswrapper[4865]: I1205 07:20:56.456321 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-ctjwp_47720eeb-4718-4077-8c64-8184aa08b670/nmstate-console-plugin/0.log" Dec 05 07:20:56 crc kubenswrapper[4865]: I1205 07:20:56.696332 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6nk5l_4b89f3ad-3a92-467c-a613-5c567bbe8e0e/nmstate-handler/0.log" Dec 05 07:20:56 crc kubenswrapper[4865]: I1205 07:20:56.788618 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-wm4nq_76931d96-861b-4372-9209-98f4b296df1c/kube-rbac-proxy/0.log" Dec 05 07:20:56 crc kubenswrapper[4865]: I1205 07:20:56.865764 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-wm4nq_76931d96-861b-4372-9209-98f4b296df1c/nmstate-metrics/0.log" Dec 05 07:20:56 crc kubenswrapper[4865]: I1205 07:20:56.942253 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-fdgdj_806823de-8d1e-48f9-964f-86cd689434c7/nmstate-operator/0.log" Dec 05 07:20:57 crc kubenswrapper[4865]: I1205 07:20:57.084939 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-cpqvb_88204ee2-fa2e-4780-97c0-4ca5aa8554fe/nmstate-webhook/0.log" Dec 05 07:21:03 crc kubenswrapper[4865]: I1205 07:21:03.006918 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:21:03 crc kubenswrapper[4865]: E1205 07:21:03.007508 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:21:14 crc kubenswrapper[4865]: I1205 07:21:14.646388 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-m74rz_a03656bf-d0cc-4e06-b6ce-470766d186d0/kube-rbac-proxy/0.log" Dec 05 07:21:14 crc kubenswrapper[4865]: I1205 07:21:14.699662 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-m74rz_a03656bf-d0cc-4e06-b6ce-470766d186d0/controller/0.log" Dec 05 07:21:14 crc kubenswrapper[4865]: I1205 07:21:14.884220 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-frr-files/0.log" Dec 05 07:21:15 crc kubenswrapper[4865]: I1205 07:21:15.032104 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-frr-files/0.log" Dec 05 07:21:15 crc kubenswrapper[4865]: I1205 07:21:15.051252 4865 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-reloader/0.log" Dec 05 07:21:15 crc kubenswrapper[4865]: I1205 07:21:15.058429 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-metrics/0.log" Dec 05 07:21:15 crc kubenswrapper[4865]: I1205 07:21:15.125767 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-reloader/0.log" Dec 05 07:21:15 crc kubenswrapper[4865]: I1205 07:21:15.345503 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-frr-files/0.log" Dec 05 07:21:15 crc kubenswrapper[4865]: I1205 07:21:15.388657 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-reloader/0.log" Dec 05 07:21:15 crc kubenswrapper[4865]: I1205 07:21:15.434877 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-metrics/0.log" Dec 05 07:21:15 crc kubenswrapper[4865]: I1205 07:21:15.450413 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-metrics/0.log" Dec 05 07:21:15 crc kubenswrapper[4865]: I1205 07:21:15.711748 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/controller/0.log" Dec 05 07:21:15 crc kubenswrapper[4865]: I1205 07:21:15.719548 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-reloader/0.log" Dec 05 07:21:15 crc kubenswrapper[4865]: I1205 07:21:15.820390 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-frr-files/0.log" Dec 05 07:21:15 crc kubenswrapper[4865]: I1205 07:21:15.833757 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/cp-metrics/0.log" Dec 05 07:21:15 crc kubenswrapper[4865]: I1205 07:21:15.946916 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/frr-metrics/0.log" Dec 05 07:21:16 crc kubenswrapper[4865]: I1205 07:21:16.155796 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/kube-rbac-proxy-frr/0.log" Dec 05 07:21:16 crc kubenswrapper[4865]: I1205 07:21:16.160009 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/kube-rbac-proxy/0.log" Dec 05 07:21:16 crc kubenswrapper[4865]: I1205 07:21:16.242395 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/reloader/0.log" Dec 05 07:21:16 crc kubenswrapper[4865]: I1205 07:21:16.564877 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-vhnps_048569aa-8159-43b3-9ed2-55cef99d90bb/frr-k8s-webhook-server/0.log" Dec 05 07:21:17 crc kubenswrapper[4865]: I1205 07:21:17.375323 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:21:17 crc 
kubenswrapper[4865]: I1205 07:21:17.388786 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-565b7bc7b8-6qwxr_46237ec7-567d-47d0-9994-120d3f2039e8/manager/0.log" Dec 05 07:21:17 crc kubenswrapper[4865]: I1205 07:21:17.607753 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5b4f9f77c-pdn56_d30b601d-b803-4d64-923f-b085545350ee/webhook-server/0.log" Dec 05 07:21:17 crc kubenswrapper[4865]: I1205 07:21:17.816084 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jmvs2_ce563705-9a7e-4202-a8f4-512c17a481fb/kube-rbac-proxy/0.log" Dec 05 07:21:17 crc kubenswrapper[4865]: I1205 07:21:17.836403 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjpk7_0f46c5b9-45e6-4002-a5e8-e07ecf828a80/frr/0.log" Dec 05 07:21:18 crc kubenswrapper[4865]: I1205 07:21:18.231785 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jmvs2_ce563705-9a7e-4202-a8f4-512c17a481fb/speaker/0.log" Dec 05 07:21:18 crc kubenswrapper[4865]: I1205 07:21:18.415193 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"4d1389b70ebba2de7005265e4f233429a58a08f02863518672f7702a86cffde2"} Dec 05 07:21:31 crc kubenswrapper[4865]: I1205 07:21:31.225805 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t_9466efcc-eb07-4316-a188-5b18e8108180/util/0.log" Dec 05 07:21:31 crc kubenswrapper[4865]: I1205 07:21:31.411607 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t_9466efcc-eb07-4316-a188-5b18e8108180/util/0.log" Dec 05 07:21:31 crc kubenswrapper[4865]: I1205 07:21:31.424880 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t_9466efcc-eb07-4316-a188-5b18e8108180/pull/0.log" Dec 05 07:21:31 crc kubenswrapper[4865]: I1205 07:21:31.480858 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t_9466efcc-eb07-4316-a188-5b18e8108180/pull/0.log" Dec 05 07:21:31 crc kubenswrapper[4865]: I1205 07:21:31.632207 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t_9466efcc-eb07-4316-a188-5b18e8108180/util/0.log" Dec 05 07:21:31 crc kubenswrapper[4865]: I1205 07:21:31.640498 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t_9466efcc-eb07-4316-a188-5b18e8108180/extract/0.log" Dec 05 07:21:31 crc kubenswrapper[4865]: I1205 07:21:31.646522 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc2h5t_9466efcc-eb07-4316-a188-5b18e8108180/pull/0.log" Dec 05 07:21:31 crc kubenswrapper[4865]: I1205 07:21:31.805753 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fmcz8"] Dec 05 07:21:31 crc kubenswrapper[4865]: E1205 07:21:31.806623 4865 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="8e8bca27-0835-42cf-af59-c9b2238ebeb2" containerName="extract-utilities" Dec 05 07:21:31 crc kubenswrapper[4865]: I1205 07:21:31.806690 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8bca27-0835-42cf-af59-c9b2238ebeb2" containerName="extract-utilities" Dec 05 07:21:31 crc kubenswrapper[4865]: E1205 07:21:31.806753 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e8bca27-0835-42cf-af59-c9b2238ebeb2" containerName="registry-server" Dec 05 07:21:31 crc kubenswrapper[4865]: I1205 07:21:31.806805 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8bca27-0835-42cf-af59-c9b2238ebeb2" containerName="registry-server" Dec 05 07:21:31 crc kubenswrapper[4865]: E1205 07:21:31.806878 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e8bca27-0835-42cf-af59-c9b2238ebeb2" containerName="extract-content" Dec 05 07:21:31 crc kubenswrapper[4865]: I1205 07:21:31.806926 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8bca27-0835-42cf-af59-c9b2238ebeb2" containerName="extract-content" Dec 05 07:21:31 crc kubenswrapper[4865]: I1205 07:21:31.807168 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e8bca27-0835-42cf-af59-c9b2238ebeb2" containerName="registry-server" Dec 05 07:21:31 crc kubenswrapper[4865]: I1205 07:21:31.808526 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fmcz8" Dec 05 07:21:31 crc kubenswrapper[4865]: I1205 07:21:31.836362 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fmcz8"] Dec 05 07:21:31 crc kubenswrapper[4865]: I1205 07:21:31.894055 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb_24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec/util/0.log" Dec 05 07:21:31 crc kubenswrapper[4865]: I1205 07:21:31.951790 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b-catalog-content\") pod \"redhat-operators-fmcz8\" (UID: \"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b\") " pod="openshift-marketplace/redhat-operators-fmcz8" Dec 05 07:21:31 crc kubenswrapper[4865]: I1205 07:21:31.951985 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b-utilities\") pod \"redhat-operators-fmcz8\" (UID: \"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b\") " pod="openshift-marketplace/redhat-operators-fmcz8" Dec 05 07:21:31 crc kubenswrapper[4865]: I1205 07:21:31.952106 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhwmr\" (UniqueName: \"kubernetes.io/projected/6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b-kube-api-access-zhwmr\") pod \"redhat-operators-fmcz8\" (UID: \"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b\") " pod="openshift-marketplace/redhat-operators-fmcz8" Dec 05 07:21:32 crc kubenswrapper[4865]: I1205 07:21:32.080992 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b-utilities\") pod \"redhat-operators-fmcz8\" (UID: \"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b\") " pod="openshift-marketplace/redhat-operators-fmcz8" Dec 05 07:21:32 crc 
kubenswrapper[4865]: I1205 07:21:32.081700 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhwmr\" (UniqueName: \"kubernetes.io/projected/6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b-kube-api-access-zhwmr\") pod \"redhat-operators-fmcz8\" (UID: \"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b\") " pod="openshift-marketplace/redhat-operators-fmcz8" Dec 05 07:21:32 crc kubenswrapper[4865]: I1205 07:21:32.082137 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b-catalog-content\") pod \"redhat-operators-fmcz8\" (UID: \"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b\") " pod="openshift-marketplace/redhat-operators-fmcz8" Dec 05 07:21:32 crc kubenswrapper[4865]: I1205 07:21:32.081628 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b-utilities\") pod \"redhat-operators-fmcz8\" (UID: \"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b\") " pod="openshift-marketplace/redhat-operators-fmcz8" Dec 05 07:21:32 crc kubenswrapper[4865]: I1205 07:21:32.082425 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b-catalog-content\") pod \"redhat-operators-fmcz8\" (UID: \"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b\") " pod="openshift-marketplace/redhat-operators-fmcz8" Dec 05 07:21:32 crc kubenswrapper[4865]: I1205 07:21:32.106071 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhwmr\" (UniqueName: \"kubernetes.io/projected/6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b-kube-api-access-zhwmr\") pod \"redhat-operators-fmcz8\" (UID: \"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b\") " pod="openshift-marketplace/redhat-operators-fmcz8" Dec 05 07:21:32 crc kubenswrapper[4865]: I1205 07:21:32.142485 4865 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fmcz8" Dec 05 07:21:32 crc kubenswrapper[4865]: I1205 07:21:32.149752 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb_24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec/pull/0.log" Dec 05 07:21:32 crc kubenswrapper[4865]: I1205 07:21:32.193029 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb_24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec/pull/0.log" Dec 05 07:21:32 crc kubenswrapper[4865]: I1205 07:21:32.228707 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb_24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec/util/0.log" Dec 05 07:21:32 crc kubenswrapper[4865]: I1205 07:21:32.629742 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb_24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec/extract/0.log" Dec 05 07:21:32 crc kubenswrapper[4865]: I1205 07:21:32.668443 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fmcz8"] Dec 05 07:21:32 crc kubenswrapper[4865]: I1205 07:21:32.669179 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb_24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec/util/0.log" Dec 05 07:21:32 crc kubenswrapper[4865]: I1205 07:21:32.738271 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r76zb_24eb4b5e-4be2-4f7b-8dc7-c72e693de7ec/pull/0.log" Dec 05 07:21:32 crc kubenswrapper[4865]: I1205 07:21:32.937279 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vvxh5_be94c16b-4a9a-4ad6-aafc-9879b95fdce6/extract-utilities/0.log" Dec 05 07:21:33 crc kubenswrapper[4865]: I1205 07:21:33.218655 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vvxh5_be94c16b-4a9a-4ad6-aafc-9879b95fdce6/extract-utilities/0.log" Dec 05 07:21:33 crc kubenswrapper[4865]: I1205 07:21:33.288767 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vvxh5_be94c16b-4a9a-4ad6-aafc-9879b95fdce6/extract-content/0.log" Dec 05 07:21:33 crc kubenswrapper[4865]: I1205 07:21:33.313947 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vvxh5_be94c16b-4a9a-4ad6-aafc-9879b95fdce6/extract-content/0.log" Dec 05 07:21:33 crc kubenswrapper[4865]: I1205 07:21:33.577986 4865 generic.go:334] "Generic (PLEG): container finished" podID="6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b" containerID="e0a84ee58e36da8e1d12e68e424b22040da4ad448833fd894fb8d5d3ef944209" exitCode=0 Dec 05 07:21:33 crc kubenswrapper[4865]: I1205 07:21:33.578890 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmcz8" event={"ID":"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b","Type":"ContainerDied","Data":"e0a84ee58e36da8e1d12e68e424b22040da4ad448833fd894fb8d5d3ef944209"} Dec 05 07:21:33 crc kubenswrapper[4865]: I1205 07:21:33.578948 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmcz8" 
event={"ID":"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b","Type":"ContainerStarted","Data":"ef3e34d7c93a3e1627aabdd4aef8abb2cc5f748881a1955e9faad9ce29dce543"} Dec 05 07:21:33 crc kubenswrapper[4865]: I1205 07:21:33.659995 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vvxh5_be94c16b-4a9a-4ad6-aafc-9879b95fdce6/extract-utilities/0.log" Dec 05 07:21:33 crc kubenswrapper[4865]: I1205 07:21:33.688004 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vvxh5_be94c16b-4a9a-4ad6-aafc-9879b95fdce6/extract-content/0.log" Dec 05 07:21:33 crc kubenswrapper[4865]: I1205 07:21:33.961700 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jjwkp_fa31bc96-9494-4d49-b271-7059d1a6d0e0/extract-utilities/0.log" Dec 05 07:21:34 crc kubenswrapper[4865]: I1205 07:21:34.327519 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jjwkp_fa31bc96-9494-4d49-b271-7059d1a6d0e0/extract-content/0.log" Dec 05 07:21:34 crc kubenswrapper[4865]: I1205 07:21:34.367677 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vvxh5_be94c16b-4a9a-4ad6-aafc-9879b95fdce6/registry-server/0.log" Dec 05 07:21:34 crc kubenswrapper[4865]: I1205 07:21:34.431063 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jjwkp_fa31bc96-9494-4d49-b271-7059d1a6d0e0/extract-utilities/0.log" Dec 05 07:21:34 crc kubenswrapper[4865]: I1205 07:21:34.443643 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jjwkp_fa31bc96-9494-4d49-b271-7059d1a6d0e0/extract-content/0.log" Dec 05 07:21:34 crc kubenswrapper[4865]: I1205 07:21:34.588610 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmcz8" event={"ID":"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b","Type":"ContainerStarted","Data":"98bf606ef5cf05d71dcbb763a486711961f66f3a6ebc1009fbf444611f54ff71"} Dec 05 07:21:34 crc kubenswrapper[4865]: I1205 07:21:34.592216 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jjwkp_fa31bc96-9494-4d49-b271-7059d1a6d0e0/extract-utilities/0.log" Dec 05 07:21:34 crc kubenswrapper[4865]: I1205 07:21:34.627225 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jjwkp_fa31bc96-9494-4d49-b271-7059d1a6d0e0/extract-content/0.log" Dec 05 07:21:35 crc kubenswrapper[4865]: I1205 07:21:35.003347 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-46kd4_f1f44df5-e37a-4e11-995d-91a6d2fc538d/extract-utilities/0.log" Dec 05 07:21:35 crc kubenswrapper[4865]: I1205 07:21:35.070596 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-bnwqc_688f2ae9-08ac-4eb2-9d51-d6c4f9c3be5d/marketplace-operator/0.log" Dec 05 07:21:35 crc kubenswrapper[4865]: I1205 07:21:35.274251 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-46kd4_f1f44df5-e37a-4e11-995d-91a6d2fc538d/extract-content/0.log" Dec 05 07:21:35 crc kubenswrapper[4865]: I1205 07:21:35.336958 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-46kd4_f1f44df5-e37a-4e11-995d-91a6d2fc538d/extract-utilities/0.log" 
Dec 05 07:21:35 crc kubenswrapper[4865]: I1205 07:21:35.350601 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-46kd4_f1f44df5-e37a-4e11-995d-91a6d2fc538d/extract-content/0.log" Dec 05 07:21:35 crc kubenswrapper[4865]: I1205 07:21:35.723732 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-46kd4_f1f44df5-e37a-4e11-995d-91a6d2fc538d/extract-utilities/0.log" Dec 05 07:21:35 crc kubenswrapper[4865]: I1205 07:21:35.732243 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-46kd4_f1f44df5-e37a-4e11-995d-91a6d2fc538d/extract-content/0.log" Dec 05 07:21:35 crc kubenswrapper[4865]: I1205 07:21:35.904802 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jjwkp_fa31bc96-9494-4d49-b271-7059d1a6d0e0/registry-server/0.log" Dec 05 07:21:36 crc kubenswrapper[4865]: I1205 07:21:36.072177 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f4cmh_4aa63a1a-0917-4c28-9307-4580800618e2/extract-utilities/0.log" Dec 05 07:21:36 crc kubenswrapper[4865]: I1205 07:21:36.110565 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-46kd4_f1f44df5-e37a-4e11-995d-91a6d2fc538d/registry-server/0.log" Dec 05 07:21:36 crc kubenswrapper[4865]: I1205 07:21:36.304124 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f4cmh_4aa63a1a-0917-4c28-9307-4580800618e2/extract-utilities/0.log" Dec 05 07:21:36 crc kubenswrapper[4865]: I1205 07:21:36.338030 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f4cmh_4aa63a1a-0917-4c28-9307-4580800618e2/extract-content/0.log" Dec 05 07:21:36 crc kubenswrapper[4865]: I1205 07:21:36.376200 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f4cmh_4aa63a1a-0917-4c28-9307-4580800618e2/extract-content/0.log" Dec 05 07:21:36 crc kubenswrapper[4865]: I1205 07:21:36.501144 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f4cmh_4aa63a1a-0917-4c28-9307-4580800618e2/extract-content/0.log" Dec 05 07:21:36 crc kubenswrapper[4865]: I1205 07:21:36.529411 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f4cmh_4aa63a1a-0917-4c28-9307-4580800618e2/extract-utilities/0.log" Dec 05 07:21:37 crc kubenswrapper[4865]: I1205 07:21:37.255941 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f4cmh_4aa63a1a-0917-4c28-9307-4580800618e2/registry-server/0.log" Dec 05 07:21:37 crc kubenswrapper[4865]: I1205 07:21:37.618920 4865 generic.go:334] "Generic (PLEG): container finished" podID="6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b" containerID="98bf606ef5cf05d71dcbb763a486711961f66f3a6ebc1009fbf444611f54ff71" exitCode=0 Dec 05 07:21:37 crc kubenswrapper[4865]: I1205 07:21:37.618964 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmcz8" event={"ID":"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b","Type":"ContainerDied","Data":"98bf606ef5cf05d71dcbb763a486711961f66f3a6ebc1009fbf444611f54ff71"} Dec 05 07:21:38 crc kubenswrapper[4865]: I1205 07:21:38.635046 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmcz8" 
event={"ID":"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b","Type":"ContainerStarted","Data":"caa082b097899f20d8418d88c55f543229a09187e218e6bfd8dbc99e9bbac292"} Dec 05 07:21:38 crc kubenswrapper[4865]: I1205 07:21:38.678494 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fmcz8" podStartSLOduration=3.253358865 podStartE2EDuration="7.678450877s" podCreationTimestamp="2025-12-05 07:21:31 +0000 UTC" firstStartedPulling="2025-12-05 07:21:33.582446736 +0000 UTC m=+5312.862457958" lastFinishedPulling="2025-12-05 07:21:38.007538738 +0000 UTC m=+5317.287549970" observedRunningTime="2025-12-05 07:21:38.666081845 +0000 UTC m=+5317.946093067" watchObservedRunningTime="2025-12-05 07:21:38.678450877 +0000 UTC m=+5317.958462099" Dec 05 07:21:42 crc kubenswrapper[4865]: I1205 07:21:42.143144 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fmcz8" Dec 05 07:21:42 crc kubenswrapper[4865]: I1205 07:21:42.143655 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fmcz8" Dec 05 07:21:43 crc kubenswrapper[4865]: I1205 07:21:43.196175 4865 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fmcz8" podUID="6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b" containerName="registry-server" probeResult="failure" output=< Dec 05 07:21:43 crc kubenswrapper[4865]: timeout: failed to connect service ":50051" within 1s Dec 05 07:21:43 crc kubenswrapper[4865]: > Dec 05 07:21:52 crc kubenswrapper[4865]: I1205 07:21:52.198393 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fmcz8" Dec 05 07:21:52 crc kubenswrapper[4865]: I1205 07:21:52.253638 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fmcz8" Dec 05 07:21:52 crc kubenswrapper[4865]: I1205 07:21:52.441969 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fmcz8"] Dec 05 07:21:53 crc kubenswrapper[4865]: I1205 07:21:53.758886 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fmcz8" podUID="6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b" containerName="registry-server" containerID="cri-o://caa082b097899f20d8418d88c55f543229a09187e218e6bfd8dbc99e9bbac292" gracePeriod=2 Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.495884 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fmcz8" Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.559526 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b-catalog-content\") pod \"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b\" (UID: \"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b\") " Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.559723 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b-utilities\") pod \"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b\" (UID: \"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b\") " Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.559797 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhwmr\" (UniqueName: \"kubernetes.io/projected/6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b-kube-api-access-zhwmr\") pod \"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b\" (UID: \"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b\") " Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.560857 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b-utilities" (OuterVolumeSpecName: "utilities") pod "6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b" (UID: "6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.568410 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b-kube-api-access-zhwmr" (OuterVolumeSpecName: "kube-api-access-zhwmr") pod "6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b" (UID: "6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b"). InnerVolumeSpecName "kube-api-access-zhwmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.661959 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.661996 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhwmr\" (UniqueName: \"kubernetes.io/projected/6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b-kube-api-access-zhwmr\") on node \"crc\" DevicePath \"\"" Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.683709 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b" (UID: "6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.763905 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.775003 4865 generic.go:334] "Generic (PLEG): container finished" podID="6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b" containerID="caa082b097899f20d8418d88c55f543229a09187e218e6bfd8dbc99e9bbac292" exitCode=0 Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.775056 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmcz8" event={"ID":"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b","Type":"ContainerDied","Data":"caa082b097899f20d8418d88c55f543229a09187e218e6bfd8dbc99e9bbac292"} Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.775085 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmcz8" event={"ID":"6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b","Type":"ContainerDied","Data":"ef3e34d7c93a3e1627aabdd4aef8abb2cc5f748881a1955e9faad9ce29dce543"} Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.775101 4865 scope.go:117] "RemoveContainer" containerID="caa082b097899f20d8418d88c55f543229a09187e218e6bfd8dbc99e9bbac292" Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.775220 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fmcz8" Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.832543 4865 scope.go:117] "RemoveContainer" containerID="98bf606ef5cf05d71dcbb763a486711961f66f3a6ebc1009fbf444611f54ff71" Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.852975 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fmcz8"] Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.871867 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fmcz8"] Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.895207 4865 scope.go:117] "RemoveContainer" containerID="e0a84ee58e36da8e1d12e68e424b22040da4ad448833fd894fb8d5d3ef944209" Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.935513 4865 scope.go:117] "RemoveContainer" containerID="caa082b097899f20d8418d88c55f543229a09187e218e6bfd8dbc99e9bbac292" Dec 05 07:21:55 crc kubenswrapper[4865]: E1205 07:21:55.937252 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caa082b097899f20d8418d88c55f543229a09187e218e6bfd8dbc99e9bbac292\": container with ID starting with caa082b097899f20d8418d88c55f543229a09187e218e6bfd8dbc99e9bbac292 not found: ID does not exist" containerID="caa082b097899f20d8418d88c55f543229a09187e218e6bfd8dbc99e9bbac292" Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.937307 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caa082b097899f20d8418d88c55f543229a09187e218e6bfd8dbc99e9bbac292"} err="failed to get container status \"caa082b097899f20d8418d88c55f543229a09187e218e6bfd8dbc99e9bbac292\": rpc error: code = NotFound desc = could not find container \"caa082b097899f20d8418d88c55f543229a09187e218e6bfd8dbc99e9bbac292\": container with ID starting with caa082b097899f20d8418d88c55f543229a09187e218e6bfd8dbc99e9bbac292 not found: ID does not exist" Dec 05 07:21:55 crc 
kubenswrapper[4865]: I1205 07:21:55.937336 4865 scope.go:117] "RemoveContainer" containerID="98bf606ef5cf05d71dcbb763a486711961f66f3a6ebc1009fbf444611f54ff71" Dec 05 07:21:55 crc kubenswrapper[4865]: E1205 07:21:55.938628 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98bf606ef5cf05d71dcbb763a486711961f66f3a6ebc1009fbf444611f54ff71\": container with ID starting with 98bf606ef5cf05d71dcbb763a486711961f66f3a6ebc1009fbf444611f54ff71 not found: ID does not exist" containerID="98bf606ef5cf05d71dcbb763a486711961f66f3a6ebc1009fbf444611f54ff71" Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.938657 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98bf606ef5cf05d71dcbb763a486711961f66f3a6ebc1009fbf444611f54ff71"} err="failed to get container status \"98bf606ef5cf05d71dcbb763a486711961f66f3a6ebc1009fbf444611f54ff71\": rpc error: code = NotFound desc = could not find container \"98bf606ef5cf05d71dcbb763a486711961f66f3a6ebc1009fbf444611f54ff71\": container with ID starting with 98bf606ef5cf05d71dcbb763a486711961f66f3a6ebc1009fbf444611f54ff71 not found: ID does not exist" Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.938673 4865 scope.go:117] "RemoveContainer" containerID="e0a84ee58e36da8e1d12e68e424b22040da4ad448833fd894fb8d5d3ef944209" Dec 05 07:21:55 crc kubenswrapper[4865]: E1205 07:21:55.939027 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0a84ee58e36da8e1d12e68e424b22040da4ad448833fd894fb8d5d3ef944209\": container with ID starting with e0a84ee58e36da8e1d12e68e424b22040da4ad448833fd894fb8d5d3ef944209 not found: ID does not exist" containerID="e0a84ee58e36da8e1d12e68e424b22040da4ad448833fd894fb8d5d3ef944209" Dec 05 07:21:55 crc kubenswrapper[4865]: I1205 07:21:55.939048 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0a84ee58e36da8e1d12e68e424b22040da4ad448833fd894fb8d5d3ef944209"} err="failed to get container status \"e0a84ee58e36da8e1d12e68e424b22040da4ad448833fd894fb8d5d3ef944209\": rpc error: code = NotFound desc = could not find container \"e0a84ee58e36da8e1d12e68e424b22040da4ad448833fd894fb8d5d3ef944209\": container with ID starting with e0a84ee58e36da8e1d12e68e424b22040da4ad448833fd894fb8d5d3ef944209 not found: ID does not exist" Dec 05 07:21:57 crc kubenswrapper[4865]: I1205 07:21:57.017290 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b" path="/var/lib/kubelet/pods/6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b/volumes" Dec 05 07:23:40 crc kubenswrapper[4865]: I1205 07:23:40.816814 4865 scope.go:117] "RemoveContainer" containerID="9e93f511a5e9b539f30762189e6c1450ba33d4f309dc2b4a6d084e8a1b3cbc0b" Dec 05 07:23:41 crc kubenswrapper[4865]: I1205 07:23:41.048801 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 07:23:41 crc kubenswrapper[4865]: I1205 07:23:41.048872 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 07:23:52 crc kubenswrapper[4865]: I1205 07:23:52.183959 4865 generic.go:334] "Generic (PLEG): container finished" podID="5b56c3ad-c323-4c60-8fff-ff6c79f5b63c" containerID="de5f8d6ae803d052135402712ef336ada74774de0acca7c448f444a5bebc4d9e" exitCode=0 Dec 05 07:23:52 crc kubenswrapper[4865]: I1205 07:23:52.184031 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lbvjn/must-gather-r8gjf" event={"ID":"5b56c3ad-c323-4c60-8fff-ff6c79f5b63c","Type":"ContainerDied","Data":"de5f8d6ae803d052135402712ef336ada74774de0acca7c448f444a5bebc4d9e"} Dec 05 07:23:52 crc kubenswrapper[4865]: I1205 07:23:52.185511 4865 scope.go:117] "RemoveContainer" containerID="de5f8d6ae803d052135402712ef336ada74774de0acca7c448f444a5bebc4d9e" Dec 05 07:23:52 crc kubenswrapper[4865]: I1205 07:23:52.302697 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lbvjn_must-gather-r8gjf_5b56c3ad-c323-4c60-8fff-ff6c79f5b63c/gather/0.log" Dec 05 07:24:05 crc kubenswrapper[4865]: I1205 07:24:05.050711 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lbvjn/must-gather-r8gjf"] Dec 05 07:24:05 crc kubenswrapper[4865]: I1205 07:24:05.052872 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-lbvjn/must-gather-r8gjf" podUID="5b56c3ad-c323-4c60-8fff-ff6c79f5b63c" containerName="copy" containerID="cri-o://ff864269736f60482dd2a12ac16e741abad4eaed30d7834c226b3c3133ba7ce5" gracePeriod=2 Dec 05 07:24:05 crc kubenswrapper[4865]: I1205 07:24:05.078950 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lbvjn/must-gather-r8gjf"] Dec 05 07:24:05 crc kubenswrapper[4865]: I1205 07:24:05.344684 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lbvjn_must-gather-r8gjf_5b56c3ad-c323-4c60-8fff-ff6c79f5b63c/copy/0.log" Dec 05 07:24:05 crc kubenswrapper[4865]: I1205 07:24:05.348336 4865 generic.go:334] "Generic (PLEG): container finished" podID="5b56c3ad-c323-4c60-8fff-ff6c79f5b63c" containerID="ff864269736f60482dd2a12ac16e741abad4eaed30d7834c226b3c3133ba7ce5" exitCode=143 Dec 05 07:24:05 crc kubenswrapper[4865]: I1205 07:24:05.575077 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lbvjn_must-gather-r8gjf_5b56c3ad-c323-4c60-8fff-ff6c79f5b63c/copy/0.log" Dec 05 07:24:05 crc kubenswrapper[4865]: I1205 07:24:05.575457 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lbvjn/must-gather-r8gjf" Dec 05 07:24:05 crc kubenswrapper[4865]: I1205 07:24:05.736091 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smrhd\" (UniqueName: \"kubernetes.io/projected/5b56c3ad-c323-4c60-8fff-ff6c79f5b63c-kube-api-access-smrhd\") pod \"5b56c3ad-c323-4c60-8fff-ff6c79f5b63c\" (UID: \"5b56c3ad-c323-4c60-8fff-ff6c79f5b63c\") " Dec 05 07:24:05 crc kubenswrapper[4865]: I1205 07:24:05.736150 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5b56c3ad-c323-4c60-8fff-ff6c79f5b63c-must-gather-output\") pod \"5b56c3ad-c323-4c60-8fff-ff6c79f5b63c\" (UID: \"5b56c3ad-c323-4c60-8fff-ff6c79f5b63c\") " Dec 05 07:24:05 crc kubenswrapper[4865]: I1205 07:24:05.755970 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b56c3ad-c323-4c60-8fff-ff6c79f5b63c-kube-api-access-smrhd" (OuterVolumeSpecName: "kube-api-access-smrhd") pod "5b56c3ad-c323-4c60-8fff-ff6c79f5b63c" (UID: "5b56c3ad-c323-4c60-8fff-ff6c79f5b63c"). InnerVolumeSpecName "kube-api-access-smrhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:24:05 crc kubenswrapper[4865]: I1205 07:24:05.837885 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smrhd\" (UniqueName: \"kubernetes.io/projected/5b56c3ad-c323-4c60-8fff-ff6c79f5b63c-kube-api-access-smrhd\") on node \"crc\" DevicePath \"\"" Dec 05 07:24:05 crc kubenswrapper[4865]: I1205 07:24:05.930849 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b56c3ad-c323-4c60-8fff-ff6c79f5b63c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "5b56c3ad-c323-4c60-8fff-ff6c79f5b63c" (UID: "5b56c3ad-c323-4c60-8fff-ff6c79f5b63c"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:24:05 crc kubenswrapper[4865]: I1205 07:24:05.939795 4865 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5b56c3ad-c323-4c60-8fff-ff6c79f5b63c-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 05 07:24:06 crc kubenswrapper[4865]: I1205 07:24:06.358108 4865 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lbvjn_must-gather-r8gjf_5b56c3ad-c323-4c60-8fff-ff6c79f5b63c/copy/0.log" Dec 05 07:24:06 crc kubenswrapper[4865]: I1205 07:24:06.358577 4865 scope.go:117] "RemoveContainer" containerID="ff864269736f60482dd2a12ac16e741abad4eaed30d7834c226b3c3133ba7ce5" Dec 05 07:24:06 crc kubenswrapper[4865]: I1205 07:24:06.358613 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lbvjn/must-gather-r8gjf" Dec 05 07:24:06 crc kubenswrapper[4865]: I1205 07:24:06.387645 4865 scope.go:117] "RemoveContainer" containerID="de5f8d6ae803d052135402712ef336ada74774de0acca7c448f444a5bebc4d9e" Dec 05 07:24:07 crc kubenswrapper[4865]: I1205 07:24:07.027265 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b56c3ad-c323-4c60-8fff-ff6c79f5b63c" path="/var/lib/kubelet/pods/5b56c3ad-c323-4c60-8fff-ff6c79f5b63c/volumes" Dec 05 07:24:11 crc kubenswrapper[4865]: I1205 07:24:11.050206 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 07:24:11 crc kubenswrapper[4865]: I1205 07:24:11.050735 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.308272 4865 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2fj79"] Dec 05 07:24:30 crc kubenswrapper[4865]: E1205 07:24:30.309175 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b56c3ad-c323-4c60-8fff-ff6c79f5b63c" containerName="gather" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.309191 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b56c3ad-c323-4c60-8fff-ff6c79f5b63c" containerName="gather" Dec 05 07:24:30 crc kubenswrapper[4865]: E1205 07:24:30.309204 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b56c3ad-c323-4c60-8fff-ff6c79f5b63c" containerName="copy" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.309210 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b56c3ad-c323-4c60-8fff-ff6c79f5b63c" containerName="copy" Dec 05 07:24:30 crc kubenswrapper[4865]: E1205 07:24:30.309231 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b" containerName="extract-content" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.309237 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b" containerName="extract-content" Dec 05 07:24:30 crc kubenswrapper[4865]: E1205 07:24:30.309251 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b" containerName="extract-utilities" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.309259 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b" containerName="extract-utilities" Dec 05 07:24:30 crc kubenswrapper[4865]: E1205 07:24:30.309276 4865 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b" containerName="registry-server" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.309283 4865 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b" containerName="registry-server" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.309466 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b56c3ad-c323-4c60-8fff-ff6c79f5b63c" 
containerName="copy" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.309490 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bd29cb3-2aca-4702-a9b7-0364bb7f3e5b" containerName="registry-server" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.309499 4865 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b56c3ad-c323-4c60-8fff-ff6c79f5b63c" containerName="gather" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.311074 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2fj79" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.327207 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2fj79"] Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.421780 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d8sz\" (UniqueName: \"kubernetes.io/projected/f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2-kube-api-access-6d8sz\") pod \"certified-operators-2fj79\" (UID: \"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2\") " pod="openshift-marketplace/certified-operators-2fj79" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.421883 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2-utilities\") pod \"certified-operators-2fj79\" (UID: \"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2\") " pod="openshift-marketplace/certified-operators-2fj79" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.421917 4865 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2-catalog-content\") pod \"certified-operators-2fj79\" (UID: \"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2\") " pod="openshift-marketplace/certified-operators-2fj79" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.523287 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2-catalog-content\") pod \"certified-operators-2fj79\" (UID: \"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2\") " pod="openshift-marketplace/certified-operators-2fj79" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.523428 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d8sz\" (UniqueName: \"kubernetes.io/projected/f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2-kube-api-access-6d8sz\") pod \"certified-operators-2fj79\" (UID: \"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2\") " pod="openshift-marketplace/certified-operators-2fj79" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.523494 4865 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2-utilities\") pod \"certified-operators-2fj79\" (UID: \"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2\") " pod="openshift-marketplace/certified-operators-2fj79" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.523810 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2-catalog-content\") pod \"certified-operators-2fj79\" (UID: 
\"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2\") " pod="openshift-marketplace/certified-operators-2fj79" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.523889 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2-utilities\") pod \"certified-operators-2fj79\" (UID: \"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2\") " pod="openshift-marketplace/certified-operators-2fj79" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.550636 4865 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d8sz\" (UniqueName: \"kubernetes.io/projected/f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2-kube-api-access-6d8sz\") pod \"certified-operators-2fj79\" (UID: \"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2\") " pod="openshift-marketplace/certified-operators-2fj79" Dec 05 07:24:30 crc kubenswrapper[4865]: I1205 07:24:30.643894 4865 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2fj79" Dec 05 07:24:31 crc kubenswrapper[4865]: I1205 07:24:31.187155 4865 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2fj79"] Dec 05 07:24:31 crc kubenswrapper[4865]: I1205 07:24:31.632360 4865 generic.go:334] "Generic (PLEG): container finished" podID="f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2" containerID="595c65869f079db0557d1a94802818814ca10601d7adb3558db412b6ccfe0b44" exitCode=0 Dec 05 07:24:31 crc kubenswrapper[4865]: I1205 07:24:31.632455 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2fj79" event={"ID":"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2","Type":"ContainerDied","Data":"595c65869f079db0557d1a94802818814ca10601d7adb3558db412b6ccfe0b44"} Dec 05 07:24:31 crc kubenswrapper[4865]: I1205 07:24:31.632636 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2fj79" event={"ID":"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2","Type":"ContainerStarted","Data":"85f12808c5de7d0142c512e49588f27e8efdf105e8dc97752aa3bb5405eca345"} Dec 05 07:24:33 crc kubenswrapper[4865]: I1205 07:24:33.650967 4865 generic.go:334] "Generic (PLEG): container finished" podID="f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2" containerID="4bbed8c9dbf4d4b3163caa2e87f36f74d663ae56855e11abf9101024619e5422" exitCode=0 Dec 05 07:24:33 crc kubenswrapper[4865]: I1205 07:24:33.651048 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2fj79" event={"ID":"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2","Type":"ContainerDied","Data":"4bbed8c9dbf4d4b3163caa2e87f36f74d663ae56855e11abf9101024619e5422"} Dec 05 07:24:34 crc kubenswrapper[4865]: I1205 07:24:34.661861 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2fj79" event={"ID":"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2","Type":"ContainerStarted","Data":"aff9f1c672326f688c4b85c1b7e38954ed5a2c9c15b86cd7b0215256d80a48ba"} Dec 05 07:24:34 crc kubenswrapper[4865]: I1205 07:24:34.691003 4865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2fj79" podStartSLOduration=2.280208889 podStartE2EDuration="4.69098553s" podCreationTimestamp="2025-12-05 07:24:30 +0000 UTC" firstStartedPulling="2025-12-05 07:24:31.635956619 +0000 UTC m=+5490.915967841" lastFinishedPulling="2025-12-05 07:24:34.04673326 +0000 UTC m=+5493.326744482" observedRunningTime="2025-12-05 
07:24:34.685678239 +0000 UTC m=+5493.965689461" watchObservedRunningTime="2025-12-05 07:24:34.69098553 +0000 UTC m=+5493.970996752" Dec 05 07:24:40 crc kubenswrapper[4865]: I1205 07:24:40.644434 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2fj79" Dec 05 07:24:40 crc kubenswrapper[4865]: I1205 07:24:40.644966 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2fj79" Dec 05 07:24:40 crc kubenswrapper[4865]: I1205 07:24:40.698683 4865 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2fj79" Dec 05 07:24:40 crc kubenswrapper[4865]: I1205 07:24:40.765752 4865 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2fj79" Dec 05 07:24:40 crc kubenswrapper[4865]: I1205 07:24:40.871171 4865 scope.go:117] "RemoveContainer" containerID="6864cf4da0b2f80f827acccfbe3c655b666f6ca7d41fd07f2f22b3bb41328db8" Dec 05 07:24:40 crc kubenswrapper[4865]: I1205 07:24:40.950535 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2fj79"] Dec 05 07:24:41 crc kubenswrapper[4865]: I1205 07:24:41.048488 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 07:24:41 crc kubenswrapper[4865]: I1205 07:24:41.048551 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 07:24:41 crc kubenswrapper[4865]: I1205 07:24:41.048591 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 07:24:41 crc kubenswrapper[4865]: I1205 07:24:41.049302 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4d1389b70ebba2de7005265e4f233429a58a08f02863518672f7702a86cffde2"} pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 07:24:41 crc kubenswrapper[4865]: I1205 07:24:41.049350 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" containerID="cri-o://4d1389b70ebba2de7005265e4f233429a58a08f02863518672f7702a86cffde2" gracePeriod=600 Dec 05 07:24:41 crc kubenswrapper[4865]: E1205 07:24:41.536766 4865 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1356a0a_4e64_49b5_b640_3779d3abe333.slice/crio-4d1389b70ebba2de7005265e4f233429a58a08f02863518672f7702a86cffde2.scope\": RecentStats: unable to find data in memory cache]" Dec 05 07:24:41 crc kubenswrapper[4865]: I1205 07:24:41.730431 4865 generic.go:334] "Generic (PLEG): container 
finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="4d1389b70ebba2de7005265e4f233429a58a08f02863518672f7702a86cffde2" exitCode=0 Dec 05 07:24:41 crc kubenswrapper[4865]: I1205 07:24:41.730519 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"4d1389b70ebba2de7005265e4f233429a58a08f02863518672f7702a86cffde2"} Dec 05 07:24:41 crc kubenswrapper[4865]: I1205 07:24:41.731416 4865 scope.go:117] "RemoveContainer" containerID="cc94818a2af2e6fbf8bd154415f612e18cf8e203f05f7e073ebb13b702112392" Dec 05 07:24:42 crc kubenswrapper[4865]: I1205 07:24:42.740363 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerStarted","Data":"e585e99087e7860361c4008f013cfbf7ee403d1acd234105dda86f6310ac5431"} Dec 05 07:24:42 crc kubenswrapper[4865]: I1205 07:24:42.740569 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2fj79" podUID="f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2" containerName="registry-server" containerID="cri-o://aff9f1c672326f688c4b85c1b7e38954ed5a2c9c15b86cd7b0215256d80a48ba" gracePeriod=2 Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.181555 4865 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2fj79" Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.264477 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d8sz\" (UniqueName: \"kubernetes.io/projected/f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2-kube-api-access-6d8sz\") pod \"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2\" (UID: \"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2\") " Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.264549 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2-utilities\") pod \"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2\" (UID: \"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2\") " Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.264689 4865 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2-catalog-content\") pod \"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2\" (UID: \"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2\") " Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.268081 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2-utilities" (OuterVolumeSpecName: "utilities") pod "f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2" (UID: "f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.296711 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2-kube-api-access-6d8sz" (OuterVolumeSpecName: "kube-api-access-6d8sz") pod "f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2" (UID: "f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2"). InnerVolumeSpecName "kube-api-access-6d8sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.367753 4865 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d8sz\" (UniqueName: \"kubernetes.io/projected/f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2-kube-api-access-6d8sz\") on node \"crc\" DevicePath \"\"" Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.367789 4865 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.606444 4865 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2" (UID: "f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.673568 4865 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.753736 4865 generic.go:334] "Generic (PLEG): container finished" podID="f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2" containerID="aff9f1c672326f688c4b85c1b7e38954ed5a2c9c15b86cd7b0215256d80a48ba" exitCode=0 Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.753869 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2fj79" event={"ID":"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2","Type":"ContainerDied","Data":"aff9f1c672326f688c4b85c1b7e38954ed5a2c9c15b86cd7b0215256d80a48ba"} Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.753947 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2fj79" event={"ID":"f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2","Type":"ContainerDied","Data":"85f12808c5de7d0142c512e49588f27e8efdf105e8dc97752aa3bb5405eca345"} Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.753986 4865 scope.go:117] "RemoveContainer" containerID="aff9f1c672326f688c4b85c1b7e38954ed5a2c9c15b86cd7b0215256d80a48ba" Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.753881 4865 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2fj79" Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.795777 4865 scope.go:117] "RemoveContainer" containerID="4bbed8c9dbf4d4b3163caa2e87f36f74d663ae56855e11abf9101024619e5422" Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.833419 4865 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2fj79"] Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.845719 4865 scope.go:117] "RemoveContainer" containerID="595c65869f079db0557d1a94802818814ca10601d7adb3558db412b6ccfe0b44" Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.851690 4865 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2fj79"] Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.883462 4865 scope.go:117] "RemoveContainer" containerID="aff9f1c672326f688c4b85c1b7e38954ed5a2c9c15b86cd7b0215256d80a48ba" Dec 05 07:24:43 crc kubenswrapper[4865]: E1205 07:24:43.884097 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff9f1c672326f688c4b85c1b7e38954ed5a2c9c15b86cd7b0215256d80a48ba\": container with ID starting with aff9f1c672326f688c4b85c1b7e38954ed5a2c9c15b86cd7b0215256d80a48ba not found: ID does not exist" containerID="aff9f1c672326f688c4b85c1b7e38954ed5a2c9c15b86cd7b0215256d80a48ba" Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.884155 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff9f1c672326f688c4b85c1b7e38954ed5a2c9c15b86cd7b0215256d80a48ba"} err="failed to get container status \"aff9f1c672326f688c4b85c1b7e38954ed5a2c9c15b86cd7b0215256d80a48ba\": rpc error: code = NotFound desc = could not find container \"aff9f1c672326f688c4b85c1b7e38954ed5a2c9c15b86cd7b0215256d80a48ba\": container with ID starting with aff9f1c672326f688c4b85c1b7e38954ed5a2c9c15b86cd7b0215256d80a48ba not found: ID does not exist" Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.884189 4865 scope.go:117] "RemoveContainer" containerID="4bbed8c9dbf4d4b3163caa2e87f36f74d663ae56855e11abf9101024619e5422" Dec 05 07:24:43 crc kubenswrapper[4865]: E1205 07:24:43.884721 4865 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bbed8c9dbf4d4b3163caa2e87f36f74d663ae56855e11abf9101024619e5422\": container with ID starting with 4bbed8c9dbf4d4b3163caa2e87f36f74d663ae56855e11abf9101024619e5422 not found: ID does not exist" containerID="4bbed8c9dbf4d4b3163caa2e87f36f74d663ae56855e11abf9101024619e5422" Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.884782 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bbed8c9dbf4d4b3163caa2e87f36f74d663ae56855e11abf9101024619e5422"} err="failed to get container status \"4bbed8c9dbf4d4b3163caa2e87f36f74d663ae56855e11abf9101024619e5422\": rpc error: code = NotFound desc = could not find container \"4bbed8c9dbf4d4b3163caa2e87f36f74d663ae56855e11abf9101024619e5422\": container with ID starting with 4bbed8c9dbf4d4b3163caa2e87f36f74d663ae56855e11abf9101024619e5422 not found: ID does not exist" Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.884843 4865 scope.go:117] "RemoveContainer" containerID="595c65869f079db0557d1a94802818814ca10601d7adb3558db412b6ccfe0b44" Dec 05 07:24:43 crc kubenswrapper[4865]: E1205 07:24:43.885146 4865 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"595c65869f079db0557d1a94802818814ca10601d7adb3558db412b6ccfe0b44\": container with ID starting with 595c65869f079db0557d1a94802818814ca10601d7adb3558db412b6ccfe0b44 not found: ID does not exist" containerID="595c65869f079db0557d1a94802818814ca10601d7adb3558db412b6ccfe0b44" Dec 05 07:24:43 crc kubenswrapper[4865]: I1205 07:24:43.885174 4865 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"595c65869f079db0557d1a94802818814ca10601d7adb3558db412b6ccfe0b44"} err="failed to get container status \"595c65869f079db0557d1a94802818814ca10601d7adb3558db412b6ccfe0b44\": rpc error: code = NotFound desc = could not find container \"595c65869f079db0557d1a94802818814ca10601d7adb3558db412b6ccfe0b44\": container with ID starting with 595c65869f079db0557d1a94802818814ca10601d7adb3558db412b6ccfe0b44 not found: ID does not exist" Dec 05 07:24:45 crc kubenswrapper[4865]: I1205 07:24:45.018875 4865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2" path="/var/lib/kubelet/pods/f471b3d6-ff7a-4b72-8b7a-c8a06b2c93a2/volumes" Dec 05 07:26:41 crc kubenswrapper[4865]: I1205 07:26:41.048635 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 07:26:41 crc kubenswrapper[4865]: I1205 07:26:41.050063 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 07:27:11 crc kubenswrapper[4865]: I1205 07:27:11.049228 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 07:27:11 crc kubenswrapper[4865]: I1205 07:27:11.049930 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 07:27:41 crc kubenswrapper[4865]: I1205 07:27:41.050252 4865 patch_prober.go:28] interesting pod/machine-config-daemon-hhx2r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 07:27:41 crc kubenswrapper[4865]: I1205 07:27:41.051104 4865 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 07:27:41 crc kubenswrapper[4865]: I1205 07:27:41.051195 4865 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" Dec 05 07:27:41 crc kubenswrapper[4865]: I1205 07:27:41.052983 4865 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e585e99087e7860361c4008f013cfbf7ee403d1acd234105dda86f6310ac5431"} pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 07:27:41 crc kubenswrapper[4865]: I1205 07:27:41.053153 4865 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerName="machine-config-daemon" containerID="cri-o://e585e99087e7860361c4008f013cfbf7ee403d1acd234105dda86f6310ac5431" gracePeriod=600 Dec 05 07:27:41 crc kubenswrapper[4865]: E1205 07:27:41.202857 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:27:41 crc kubenswrapper[4865]: I1205 07:27:41.845257 4865 generic.go:334] "Generic (PLEG): container finished" podID="c1356a0a-4e64-49b5-b640-3779d3abe333" containerID="e585e99087e7860361c4008f013cfbf7ee403d1acd234105dda86f6310ac5431" exitCode=0 Dec 05 07:27:41 crc kubenswrapper[4865]: I1205 07:27:41.845309 4865 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" event={"ID":"c1356a0a-4e64-49b5-b640-3779d3abe333","Type":"ContainerDied","Data":"e585e99087e7860361c4008f013cfbf7ee403d1acd234105dda86f6310ac5431"} Dec 05 07:27:41 crc kubenswrapper[4865]: I1205 07:27:41.845702 4865 scope.go:117] "RemoveContainer" containerID="4d1389b70ebba2de7005265e4f233429a58a08f02863518672f7702a86cffde2" Dec 05 07:27:41 crc kubenswrapper[4865]: I1205 07:27:41.846375 4865 scope.go:117] "RemoveContainer" containerID="e585e99087e7860361c4008f013cfbf7ee403d1acd234105dda86f6310ac5431" Dec 05 07:27:41 crc kubenswrapper[4865]: E1205 07:27:41.846646 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:27:56 crc kubenswrapper[4865]: I1205 07:27:56.007374 4865 scope.go:117] "RemoveContainer" containerID="e585e99087e7860361c4008f013cfbf7ee403d1acd234105dda86f6310ac5431" Dec 05 07:27:56 crc kubenswrapper[4865]: E1205 07:27:56.008244 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:28:10 crc 
kubenswrapper[4865]: I1205 07:28:10.006795 4865 scope.go:117] "RemoveContainer" containerID="e585e99087e7860361c4008f013cfbf7ee403d1acd234105dda86f6310ac5431" Dec 05 07:28:10 crc kubenswrapper[4865]: E1205 07:28:10.007608 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:28:21 crc kubenswrapper[4865]: I1205 07:28:21.022858 4865 scope.go:117] "RemoveContainer" containerID="e585e99087e7860361c4008f013cfbf7ee403d1acd234105dda86f6310ac5431" Dec 05 07:28:21 crc kubenswrapper[4865]: E1205 07:28:21.024126 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333" Dec 05 07:28:33 crc kubenswrapper[4865]: I1205 07:28:33.007209 4865 scope.go:117] "RemoveContainer" containerID="e585e99087e7860361c4008f013cfbf7ee403d1acd234105dda86f6310ac5431" Dec 05 07:28:33 crc kubenswrapper[4865]: E1205 07:28:33.008084 4865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hhx2r_openshift-machine-config-operator(c1356a0a-4e64-49b5-b640-3779d3abe333)\"" pod="openshift-machine-config-operator/machine-config-daemon-hhx2r" podUID="c1356a0a-4e64-49b5-b640-3779d3abe333"